DELE ST1504 CA1 Part A: Convolutional Neural Network
Objective:
To build image classifiers using deep learning networks for 15 types of vegetables. Two different input sizes (31x31 pixels and 128x128 pixels) will be considered, and the images will be converted to grayscale (1 channel) for both input sizes. The classification accuracies of the two models will be compared and discussed.
Background Information:
In this task, we will focus on developing image classifiers for a dataset containing 15 types of vegetables. The goal is to accurately classify the vegetables based on grayscale images with two different input sizes: 31x31 pixels and 128x128 pixels. By comparing the performance of models trained with these different input sizes, we can gain insights into the impact of image resolution on classification accuracy. This assignment's constraints are done to represent any difficuties that are possible in real-life data gathering, such as in the medical field.
Additional Information:
-nil-
Output Variable:
The main output variable of this task is the classification accuracy of the two models for each input size (31x31 and 128x128 pixels). Additionally, the task aims to provide insights and a discussion on the performance differences between the two models and their suitability for the given image classification problem.
import gc
import numpy as np
import pandas as pd
import seaborn as sns
import tensorflow as tf
import matplotlib.pyplot as plt
from collections import Counter
from itertools import permutations
import os, random, pathlib, warnings, itertools, math
import CNN_function as cnnf
#importing tensorflow libraries
from tensorflow.keras.layers import Input
from tensorflow.keras.layers import Layer
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import Dropout
from tensorflow.keras.layers import Flatten
from tensorflow.keras.layers import ReLU
from tensorflow.keras.utils import plot_model
from tensorflow.keras.models import load_model
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import concatenate
from tensorflow.keras.applications import VGG16
from tensorflow.keras.models import clone_model
from tensorflow.keras.layers import MaxPooling2D
from tensorflow.keras.layers import AveragePooling2D
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras.layers import GlobalAveragePooling2D
from tensorflow.keras.layers import ActivityRegularization
from tensorflow.keras.preprocessing.image import ImageDataGenerator
#importing other libraries
from IPython.display import clear_output
from sklearn.metrics import confusion_matrix
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.utils.class_weight import compute_class_weight
#ignoring redundant warnings
from warnings import simplefilter
simplefilter(action='ignore', category=UserWarning)
simplefilter(action='ignore', category=FutureWarning)
# Fix random seed for reproducibility
seed = 1
np.random.seed(seed)
tf.random.set_seed(seed)
tf.keras.utils.set_random_seed(0)
gpus = tf.config.experimental.list_physical_devices('GPU')
for gpu in gpus:
print(tf.config.experimental.get_device_details(gpu))
tf.config.experimental.set_memory_growth(gpu, True)
print(f"There are {len(gpus)} GPU(s) present.")
{'device_name': 'NVIDIA GeForce RTX 3060', 'compute_capability': (8, 6)}
There are 1 GPU(s) present.
dataset='./Dataset for CA1 part A/Dataset for CA1 part A'
train_folder = os.path.join(dataset,"train")
test_folder = os.path.join(dataset,"validation")
validation_folder = os.path.join(dataset,"test")
test_sample_count = pd.DataFrame(columns = ["name", "count"])
train_sample_count = pd.DataFrame(columns = ["name", "count"])
validation_sample_count = pd.DataFrame(columns = ["name", "count"])
def count_files(rootdir):
count_dataframe = pd.DataFrame(columns=["name", "count"])
for i, path in enumerate(pathlib.Path(rootdir).iterdir()):
if path.is_dir(): # count files only in directories
count = len([name for name in os.listdir(path) if os.path.isfile(os.path.join(path, name))])
count_dataframe.loc[i] = [path.name, count] # Using loc to add rows
return count_dataframe
train_sample_count = count_files(train_folder)
test_sample_count = count_files(test_folder)
validation_sample_count = count_files(validation_folder)
batch_size = None
normal_train_data = tf.keras.utils.image_dataset_from_directory(train_folder, batch_size = batch_size, seed = seed, image_size = (224, 224))
normal_test_data = tf.keras.utils.image_dataset_from_directory(test_folder, batch_size = batch_size, seed = seed, image_size = (224, 224))
normal_validation_data = tf.keras.utils.image_dataset_from_directory(validation_folder, batch_size = batch_size, seed = seed, image_size = (224, 224))
normal_full_data = normal_train_data.concatenate(normal_test_data)
normal_full_data = normal_full_data.concatenate(normal_validation_data)
print(f"There are {len(normal_train_data.class_names)} classes, {list(normal_train_data.class_names)}")
class_names = list(normal_train_data.class_names)
def normalize_image(image, label):
image = tf.cast(image, tf.float32) / 255.0
return image, label
normal_train_data = normal_train_data.map(normalize_image)
normal_test_data = normal_test_data.map(normalize_image)
normal_validation_data = normal_validation_data.map(normalize_image)
normal_full_data = normal_full_data.map(normalize_image)
batch_size = 32
normal_test_data = normal_test_data.batch(batch_size)
normal_train_data = normal_train_data.batch(batch_size)
normal_validation_data = normal_validation_data.batch(batch_size)
Found 9028 files belonging to 15 classes. Found 3000 files belonging to 15 classes. Found 3000 files belonging to 15 classes. There are 15 classes, ['Bean', 'Bitter_Gourd', 'Bottle_Gourd', 'Brinjal', 'Broccoli', 'Cabbage', 'Capsicum', 'Carrot', 'Cauliflower', 'Cucumber', 'Papaya', 'Potato', 'Pumpkin', 'Radish', 'Tomato']
Creating functions to display dataset images.
Using previously created functions for EDA.
cnnf.class_image(normal_full_data, class_names, cmap = "gray")
cnnf.average_image(normal_full_data)
From the above, which is a average of all the pictures in our dataset, we can see that there are no outstanding features in the dataset. The dataset also appears to be uniform, with the dominant color being green. The dataset also seems to be homogenous, with most of the details being shared accross the images.
cnnf.average_images_per_class(normal_full_data, class_names)
From the above, we can see that there are 3 distinct colors in the dataset, with Capsicums, Carrots, Potatoes, and Tomatoes having high red values, while Cauliflowers, Pumpkins, and Radishes being more white, meaning that they are higher in all color channels. The rest of the vegetables have greener average images, showing that they have a higher load in the green color channel.
fig, axs = plt.subplots(1,3, sharey=True, figsize = (20,5))
cnnf.image_count_graph(train_sample_count , ax = axs[0], set_name = "Training")
cnnf.image_count_graph(test_sample_count, ax = axs[1], set_name = "Testing")
cnnf.image_count_graph(validation_sample_count, ax = axs[2], set_name = "Validation")
plt.show()
From the above, we can see that the dataset is imbalanced.
mean_val, std_val = cnnf.compute_running_mean_std(normal_full_data)
print(f"The mean brightness of all images is {mean_val}")
print(f"The standard deviation of all images is {std_val}")
The mean brightness of all images is 0.4197029846784955 The standard deviation of all images is 0.24243864096348536
(Note: we scaled the scalars of the image to be from 0 to 1 earlier.)
To help us identify the best data preparation pipeline, we shall establish a baseline model, then test it out on the different steps, to see which steps are the right ones to take to improve the accuracy of the model. As a baseline model, we shall use a simple convolutional neural network, using tensorflow's Conv2D layers. These are better suited for image classification tasks compared to other layers such as the Dense layers, due to their ability to work on local regions of the image, allowing it to detect features such as edges and patterns. This allows it to understand the structure and content of an image better than other types of models. In addition, we shall use Sparse Categorical Crossentropy over other types of entropy loss functions, as it is better suited to the nature of our dataset. Sparse Categorical Crossentropy is useful when handling a larger amount of classes, like our use case here. If we were to one-hot encode the labels, like is needed for Categorical Crossentropy, we would have generated larger, sparser matrices, which are inefficent when it comes to memory usage. Sparse Categorical Crossentropy also directly uses class labels instead of one-hot encoded vectores, allowing us to skip a step in the data preparation pipeline, simplifying the data pre-preprocessing.
The formula for Sparse Categorical Crossentropy is as follows:
\begin{aligned} L &= -\sum_{i=1}^{N} \log(p_{y_i}) \\ \text{where} \\ L &\text{ is the loss} \\ N &\text{ is the number of samples in the dataset} \\ y_i &\text{ is the true class label for the } i\text{-th sample} \\ p_{y_i} &\text{ is the predicted probability of the true class } y_i \text{ for the } i\text{-th sample} \end{aligned}Furthermore, we shall use the Adam optimizer first. ADAptive Moment Estimation is a extension of Stochastic Gradient Descent. It combines ideas from two other optimization algorithms, namely Momentum and RMSProp. It is a good choice for us to start with, given it's "self-learning" properties, which means that it's learning rates are adaptive, relative to how frequently a parameter gets updated while training, making it suitable for problems with sparse gradients or with noisy data, like ours. It is also omputationally efficient with relatively low memory requirements, making it suitable for problems with large datasets or parameters.
earlyStopper = EarlyStopping(monitor = 'val_loss', patience = 10)
num_classes = len(class_names)
baseline_model = Sequential()
baseline_model.add(Conv2D(16, (3, 3), activation='relu'))
baseline_model.add(MaxPooling2D((2, 2)))
baseline_model.add(Conv2D(32, (3, 3), activation='relu'))
baseline_model.add(GlobalAveragePooling2D())
baseline_model.add(Dense(num_classes, activation='softmax'))
baseline_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
baseline31_model = clone_model(baseline_model)
baseline31luminosity_model = clone_model(baseline_model)
baseline31desaturation_model = clone_model(baseline_model)
baseline128_model = clone_model(baseline_model)
baseline128luminosity_model = clone_model(baseline_model)
baseline128desaturation_model = clone_model(baseline_model)
baseline31_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
baseline31luminosity_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
baseline31desaturation_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
baseline128_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
baseline128luminosity_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
baseline128desaturation_model.compile(loss = "sparse_categorical_crossentropy", optimizer = "adam", metrics = ["accuracy"])
To prepare the data to make it ready for the model, we need to
31 x 31 and 128 x 128 pixels.We shall start by resizing the data, then moving onto investigating the different methods that can be used to greyscale the data.
def resize_image(image, label, target_height=31, target_width=31):
image = tf.image.resize(image, [target_height, target_width])
return image, label
res31_train_data = normal_train_data.map(lambda image, label: resize_image(image, label, 31, 31), num_parallel_calls=tf.data.AUTOTUNE)
res31_test_data = normal_test_data.map(lambda image, label: resize_image(image, label, 31, 31), num_parallel_calls=tf.data.AUTOTUNE)
res31_validation_data = normal_validation_data.map(lambda image, label: resize_image(image, label, 31, 31), num_parallel_calls=tf.data.AUTOTUNE)
res31_full_data = res31_train_data.concatenate(res31_test_data)
res31_full_data = res31_full_data.concatenate(res31_validation_data)
res128_train_data = normal_train_data.map(lambda image, label: resize_image(image, label, 128, 128), num_parallel_calls=tf.data.AUTOTUNE)
res128_test_data = normal_test_data.map(lambda image, label: resize_image(image, label, 128, 128), num_parallel_calls=tf.data.AUTOTUNE)
res128_validation_data = normal_validation_data.map(lambda image, label: resize_image(image, label, 128, 128), num_parallel_calls=tf.data.AUTOTUNE)
res128_full_data = res128_train_data.concatenate(res128_test_data)
res128_full_data = res128_full_data.concatenate(res128_validation_data)
We shall start by using the simplest method to greyscale the images, by averaging out all 3 color channels, to obtain the average brightness of each image.
def convert_to_greyscale(image, label):
image = tf.image.rgb_to_grayscale(image)
return image, label
res31greyscale_train_data = res31_train_data.map(lambda image, label: convert_to_greyscale(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31greyscale_test_data = res31_test_data.map(lambda image, label: convert_to_greyscale(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31greyscale_validation_data = res31_validation_data.map(lambda image, label: convert_to_greyscale(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31greyscale_full_data = res31greyscale_train_data.concatenate(res31greyscale_test_data)
res31greyscale_full_data = res31greyscale_full_data.concatenate(res31greyscale_validation_data)
res31greyscale_full_data = res31greyscale_full_data.unbatch()
res128greyscale_train_data = res128_train_data.map(lambda image, label: convert_to_greyscale(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128greyscale_test_data = res128_test_data.map(lambda image, label: convert_to_greyscale(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128greyscale_validation_data = res128_validation_data.map(lambda image, label: convert_to_greyscale(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128greyscale_full_data = res128greyscale_train_data.concatenate(res128greyscale_test_data)
res128greyscale_full_data = res128greyscale_full_data.concatenate(res128greyscale_validation_data)
res128greyscale_full_data = res128greyscale_full_data.unbatch()
Next, we shall use the luminosity method to greyscale the images. This works by adjusting the strength of the color channels based on the perception of the human eye, since it does not respond equally to all colors. The formula for the luminosity method is as follows $$ \text Grey = 0.21×Red+0.72×Green+0.07×Blue $$
def convert_to_greyscale_luminosity(image, label):
luminosity_weights = tf.constant([0.21, 0.72, 0.07], shape=(1, 1, 3), dtype=image.dtype)
grayscale_image = tf.reduce_sum(image * luminosity_weights, axis=-1, keepdims=True)
return grayscale_image, label
res31luminosity_train_data = res31_train_data.map(lambda image, label: convert_to_greyscale_luminosity(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31luminosity_test_data = res31_test_data.map(lambda image, label: convert_to_greyscale_luminosity(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31luminosity_validation_data = res31_validation_data.map(lambda image, label: convert_to_greyscale_luminosity(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31luminosity_full_data = res31luminosity_train_data.concatenate(res31luminosity_test_data)
res31luminosity_full_data = res31luminosity_full_data.concatenate(res31luminosity_validation_data)
res31luminosity_full_data = res31luminosity_full_data.unbatch()
res128luminosity_train_data = res128_train_data.map(lambda image, label: convert_to_greyscale_luminosity(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128luminosity_test_data = res128_test_data.map(lambda image, label: convert_to_greyscale_luminosity(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128luminosity_validation_data = res128_validation_data.map(lambda image, label: convert_to_greyscale_luminosity(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128luminosity_full_data = res128luminosity_train_data.concatenate(res128luminosity_test_data)
res128luminosity_full_data = res128luminosity_full_data.concatenate(res128luminosity_validation_data)
res128luminosity_full_data = res128luminosity_full_data.unbatch()
Lastly, we shall use the desaturation method to greyscale the images. The method calculates the grey level by averaging the maximum and minimum values of the RGB components for each pixel. The formula for the desaturation method is as follows: $$ \text{Gray} = \frac{\text{max(Red, Green, Blue)} + \text{min(Red, Green, Blue)}}{2} $$
def convert_to_greyscale_desaturation(image, label):
max_rgb = tf.reduce_max(image, axis=-1, keepdims=True)
min_rgb = tf.reduce_min(image, axis=-1, keepdims=True)
grayscale_image = (max_rgb + min_rgb) / 2
return grayscale_image, label
res31desaturation_train_data = res31_train_data.map(lambda image, label: convert_to_greyscale_desaturation(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31desaturation_test_data = res31_test_data.map(lambda image, label: convert_to_greyscale_desaturation(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31desaturation_validation_data = res31_validation_data.map(lambda image, label: convert_to_greyscale_desaturation(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res31desaturation_full_data = res31desaturation_train_data.concatenate(res31desaturation_test_data)
res31desaturation_full_data = res31desaturation_full_data.concatenate(res31desaturation_validation_data)
res31desaturation_full_data = res31desaturation_full_data.unbatch()
res128desaturation_train_data = res128_train_data.map(lambda image, label: convert_to_greyscale_desaturation(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128desaturation_test_data = res128_test_data.map(lambda image, label: convert_to_greyscale_desaturation(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128desaturation_validation_data = res128_validation_data.map(lambda image, label: convert_to_greyscale_desaturation(image, label), num_parallel_calls= tf.data.AUTOTUNE)
res128desaturation_full_data = res128desaturation_train_data.concatenate(res128desaturation_test_data)
res128desaturation_full_data = res128desaturation_full_data.concatenate(res128desaturation_validation_data)
res128desaturation_full_data = res128desaturation_full_data.unbatch()
history = baseline31_model.fit(res31greyscale_train_data, validation_data = res31greyscale_validation_data, epochs = 150, callbacks = earlyStopper)
cnnf.class_image(res31greyscale_full_data, class_names, cmap = "gray")
cnnf.model_metric_graph(history, baseline31_model, res31greyscale_validation_data, class_names= class_names,name = "baseline model for 31x31 image (normally greyscaled)")
Epoch 1/150 283/283 [==============================] - 6s 17ms/step - loss: 2.6087 - accuracy: 0.1283 - val_loss: 2.7094 - val_accuracy: 0.1067 Epoch 2/150 283/283 [==============================] - 5s 18ms/step - loss: 2.5015 - accuracy: 0.1799 - val_loss: 2.6120 - val_accuracy: 0.1570 Epoch 3/150 283/283 [==============================] - 5s 17ms/step - loss: 2.4232 - accuracy: 0.2180 - val_loss: 2.4869 - val_accuracy: 0.1710 Epoch 4/150 283/283 [==============================] - 5s 17ms/step - loss: 2.3518 - accuracy: 0.2312 - val_loss: 2.4270 - val_accuracy: 0.1720 Epoch 5/150 283/283 [==============================] - 5s 17ms/step - loss: 2.2896 - accuracy: 0.2518 - val_loss: 2.3637 - val_accuracy: 0.1890 Epoch 6/150 283/283 [==============================] - 5s 16ms/step - loss: 2.2534 - accuracy: 0.2575 - val_loss: 2.3032 - val_accuracy: 0.2070 Epoch 7/150 283/283 [==============================] - 5s 16ms/step - loss: 2.2163 - accuracy: 0.2841 - val_loss: 2.3038 - val_accuracy: 0.2103 Epoch 8/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1858 - accuracy: 0.3001 - val_loss: 2.2461 - val_accuracy: 0.2330 Epoch 9/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1552 - accuracy: 0.3130 - val_loss: 2.2042 - val_accuracy: 0.2780 Epoch 10/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1246 - accuracy: 0.3238 - val_loss: 2.2150 - val_accuracy: 0.2727 Epoch 11/150 283/283 [==============================] - 5s 16ms/step - loss: 2.0957 - accuracy: 0.3368 - val_loss: 2.1553 - val_accuracy: 0.2747 Epoch 12/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0554 - accuracy: 0.3487 - val_loss: 2.1234 - val_accuracy: 0.3200 Epoch 13/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0180 - accuracy: 0.3636 - val_loss: 2.0647 - val_accuracy: 0.3263 Epoch 14/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9950 - accuracy: 0.3700 - val_loss: 2.0547 - val_accuracy: 0.3450 Epoch 15/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9582 - accuracy: 0.3825 - val_loss: 2.0025 - val_accuracy: 0.3643 Epoch 16/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9265 - accuracy: 0.3983 - val_loss: 2.0133 - val_accuracy: 0.3367 Epoch 17/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8962 - accuracy: 0.4072 - val_loss: 1.9982 - val_accuracy: 0.3367 Epoch 18/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8799 - accuracy: 0.4066 - val_loss: 1.9766 - val_accuracy: 0.3503 Epoch 19/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8576 - accuracy: 0.4202 - val_loss: 1.9206 - val_accuracy: 0.3940 Epoch 20/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8365 - accuracy: 0.4243 - val_loss: 1.9021 - val_accuracy: 0.3980 Epoch 21/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8239 - accuracy: 0.4287 - val_loss: 1.8985 - val_accuracy: 0.3957 Epoch 22/150 283/283 [==============================] - 5s 15ms/step - loss: 1.8086 - accuracy: 0.4343 - val_loss: 2.0177 - val_accuracy: 0.3470 Epoch 23/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7901 - accuracy: 0.4456 - val_loss: 1.9101 - val_accuracy: 0.3717 Epoch 24/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7761 - accuracy: 0.4472 - val_loss: 1.9037 - val_accuracy: 0.3717 Epoch 25/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7670 - accuracy: 0.4543 - val_loss: 1.8465 - val_accuracy: 0.4087 Epoch 26/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7536 - accuracy: 0.4502 - val_loss: 1.8077 - val_accuracy: 0.4460 Epoch 27/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7484 - accuracy: 0.4567 - val_loss: 1.8285 - val_accuracy: 0.4203 Epoch 28/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7325 - accuracy: 0.4681 - val_loss: 1.8090 - val_accuracy: 0.4287 Epoch 29/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7319 - accuracy: 0.4600 - val_loss: 1.7781 - val_accuracy: 0.4400 Epoch 30/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7125 - accuracy: 0.4721 - val_loss: 1.7838 - val_accuracy: 0.4480 Epoch 31/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7066 - accuracy: 0.4752 - val_loss: 1.8057 - val_accuracy: 0.4363 Epoch 32/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6919 - accuracy: 0.4788 - val_loss: 1.7779 - val_accuracy: 0.4337 Epoch 33/150 283/283 [==============================] - 5s 15ms/step - loss: 1.6917 - accuracy: 0.4753 - val_loss: 1.7909 - val_accuracy: 0.4217 Epoch 34/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6899 - accuracy: 0.4823 - val_loss: 1.7527 - val_accuracy: 0.4563 Epoch 35/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6691 - accuracy: 0.4883 - val_loss: 1.8104 - val_accuracy: 0.4203 Epoch 36/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6608 - accuracy: 0.4864 - val_loss: 1.7318 - val_accuracy: 0.4477 Epoch 37/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6540 - accuracy: 0.4896 - val_loss: 1.7157 - val_accuracy: 0.4647 Epoch 38/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6431 - accuracy: 0.4952 - val_loss: 1.7511 - val_accuracy: 0.4343 Epoch 39/150 283/283 [==============================] - 5s 15ms/step - loss: 1.6419 - accuracy: 0.4986 - val_loss: 1.7892 - val_accuracy: 0.4290 Epoch 40/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6364 - accuracy: 0.4939 - val_loss: 1.7902 - val_accuracy: 0.4263 Epoch 41/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6258 - accuracy: 0.4992 - val_loss: 1.6750 - val_accuracy: 0.4800 Epoch 42/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6255 - accuracy: 0.5002 - val_loss: 1.6711 - val_accuracy: 0.4957 Epoch 43/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6124 - accuracy: 0.5058 - val_loss: 1.6840 - val_accuracy: 0.4713 Epoch 44/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6002 - accuracy: 0.5078 - val_loss: 1.6816 - val_accuracy: 0.4607 Epoch 45/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6137 - accuracy: 0.5030 - val_loss: 1.6778 - val_accuracy: 0.4753 Epoch 46/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5950 - accuracy: 0.5090 - val_loss: 1.6333 - val_accuracy: 0.4980 Epoch 47/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5888 - accuracy: 0.5128 - val_loss: 1.6336 - val_accuracy: 0.4937 Epoch 48/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5840 - accuracy: 0.5195 - val_loss: 1.6674 - val_accuracy: 0.4807 Epoch 49/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5760 - accuracy: 0.5157 - val_loss: 1.6587 - val_accuracy: 0.4793 Epoch 50/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5711 - accuracy: 0.5164 - val_loss: 1.6311 - val_accuracy: 0.4953 Epoch 51/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5539 - accuracy: 0.5265 - val_loss: 1.6442 - val_accuracy: 0.4897 Epoch 52/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5526 - accuracy: 0.5216 - val_loss: 1.6261 - val_accuracy: 0.5063 Epoch 53/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5465 - accuracy: 0.5261 - val_loss: 1.6170 - val_accuracy: 0.4937 Epoch 54/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5444 - accuracy: 0.5234 - val_loss: 1.5920 - val_accuracy: 0.5203 Epoch 55/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5425 - accuracy: 0.5256 - val_loss: 1.5835 - val_accuracy: 0.5250 Epoch 56/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5312 - accuracy: 0.5305 - val_loss: 1.5985 - val_accuracy: 0.5043 Epoch 57/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5251 - accuracy: 0.5296 - val_loss: 1.5820 - val_accuracy: 0.5137 Epoch 58/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5224 - accuracy: 0.5329 - val_loss: 1.5914 - val_accuracy: 0.5053 Epoch 59/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5122 - accuracy: 0.5388 - val_loss: 1.5863 - val_accuracy: 0.5047 Epoch 60/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5098 - accuracy: 0.5356 - val_loss: 1.5730 - val_accuracy: 0.5207 Epoch 61/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5034 - accuracy: 0.5364 - val_loss: 1.5517 - val_accuracy: 0.5230 Epoch 62/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5033 - accuracy: 0.5407 - val_loss: 1.5497 - val_accuracy: 0.5233 Epoch 63/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4957 - accuracy: 0.5398 - val_loss: 1.5672 - val_accuracy: 0.5257 Epoch 64/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4886 - accuracy: 0.5400 - val_loss: 1.6713 - val_accuracy: 0.4773 Epoch 65/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4848 - accuracy: 0.5414 - val_loss: 1.5683 - val_accuracy: 0.5123 Epoch 66/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4815 - accuracy: 0.5443 - val_loss: 1.5558 - val_accuracy: 0.5240 Epoch 67/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4765 - accuracy: 0.5488 - val_loss: 1.5360 - val_accuracy: 0.5307 Epoch 68/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4726 - accuracy: 0.5463 - val_loss: 1.6549 - val_accuracy: 0.4770 Epoch 69/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4654 - accuracy: 0.5459 - val_loss: 1.5443 - val_accuracy: 0.5197 Epoch 70/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4566 - accuracy: 0.5577 - val_loss: 1.5394 - val_accuracy: 0.5253 Epoch 71/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4609 - accuracy: 0.5498 - val_loss: 1.5157 - val_accuracy: 0.5167 Epoch 72/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4585 - accuracy: 0.5498 - val_loss: 1.5891 - val_accuracy: 0.5100 Epoch 73/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4491 - accuracy: 0.5525 - val_loss: 1.5177 - val_accuracy: 0.5203 Epoch 74/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4459 - accuracy: 0.5573 - val_loss: 1.5231 - val_accuracy: 0.5300 Epoch 75/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4443 - accuracy: 0.5564 - val_loss: 1.5276 - val_accuracy: 0.5267 Epoch 76/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4400 - accuracy: 0.5574 - val_loss: 1.5083 - val_accuracy: 0.5317 Epoch 77/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4338 - accuracy: 0.5603 - val_loss: 1.5699 - val_accuracy: 0.5077 Epoch 78/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4376 - accuracy: 0.5583 - val_loss: 1.5679 - val_accuracy: 0.5077 Epoch 79/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4284 - accuracy: 0.5565 - val_loss: 1.4781 - val_accuracy: 0.5487 Epoch 80/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4133 - accuracy: 0.5681 - val_loss: 1.5038 - val_accuracy: 0.5290 Epoch 81/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4163 - accuracy: 0.5647 - val_loss: 1.5086 - val_accuracy: 0.5347 Epoch 82/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4130 - accuracy: 0.5648 - val_loss: 1.5596 - val_accuracy: 0.5000 Epoch 83/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4139 - accuracy: 0.5655 - val_loss: 1.5484 - val_accuracy: 0.5113 Epoch 84/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4041 - accuracy: 0.5712 - val_loss: 1.4790 - val_accuracy: 0.5407 Epoch 85/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4074 - accuracy: 0.5646 - val_loss: 1.4755 - val_accuracy: 0.5367 Epoch 86/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3997 - accuracy: 0.5647 - val_loss: 1.4636 - val_accuracy: 0.5430 Epoch 87/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3894 - accuracy: 0.5757 - val_loss: 1.5272 - val_accuracy: 0.5220 Epoch 88/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3929 - accuracy: 0.5710 - val_loss: 1.4348 - val_accuracy: 0.5507 Epoch 89/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3867 - accuracy: 0.5692 - val_loss: 1.4321 - val_accuracy: 0.5543 Epoch 90/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3881 - accuracy: 0.5753 - val_loss: 1.4614 - val_accuracy: 0.5453 Epoch 91/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3795 - accuracy: 0.5712 - val_loss: 1.4380 - val_accuracy: 0.5513 Epoch 92/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3759 - accuracy: 0.5780 - val_loss: 1.4079 - val_accuracy: 0.5580 Epoch 93/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3724 - accuracy: 0.5795 - val_loss: 1.5134 - val_accuracy: 0.5207 Epoch 94/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3683 - accuracy: 0.5771 - val_loss: 1.4150 - val_accuracy: 0.5633 Epoch 95/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3632 - accuracy: 0.5784 - val_loss: 1.4174 - val_accuracy: 0.5607 Epoch 96/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3575 - accuracy: 0.5847 - val_loss: 1.4380 - val_accuracy: 0.5477 Epoch 97/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3670 - accuracy: 0.5759 - val_loss: 1.4202 - val_accuracy: 0.5563 Epoch 98/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3530 - accuracy: 0.5885 - val_loss: 1.4821 - val_accuracy: 0.5480 Epoch 99/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3556 - accuracy: 0.5811 - val_loss: 1.3923 - val_accuracy: 0.5717 Epoch 100/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3506 - accuracy: 0.5821 - val_loss: 1.4417 - val_accuracy: 0.5447 Epoch 101/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3571 - accuracy: 0.5791 - val_loss: 1.4013 - val_accuracy: 0.5630 Epoch 102/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3499 - accuracy: 0.5847 - val_loss: 1.4781 - val_accuracy: 0.5390 Epoch 103/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3385 - accuracy: 0.5899 - val_loss: 1.4259 - val_accuracy: 0.5503 Epoch 104/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3356 - accuracy: 0.5887 - val_loss: 1.4038 - val_accuracy: 0.5563 Epoch 105/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3293 - accuracy: 0.5935 - val_loss: 1.3685 - val_accuracy: 0.5837 Epoch 106/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3261 - accuracy: 0.5916 - val_loss: 1.4137 - val_accuracy: 0.5567 Epoch 107/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3272 - accuracy: 0.5930 - val_loss: 1.3865 - val_accuracy: 0.5740 Epoch 108/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3176 - accuracy: 0.5935 - val_loss: 1.5122 - val_accuracy: 0.5290 Epoch 109/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3202 - accuracy: 0.5969 - val_loss: 1.4436 - val_accuracy: 0.5400 Epoch 110/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3277 - accuracy: 0.5888 - val_loss: 1.4327 - val_accuracy: 0.5460 Epoch 111/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3160 - accuracy: 0.5912 - val_loss: 1.3982 - val_accuracy: 0.5610 Epoch 112/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3211 - accuracy: 0.5955 - val_loss: 1.4000 - val_accuracy: 0.5500 Epoch 113/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3125 - accuracy: 0.5955 - val_loss: 1.5424 - val_accuracy: 0.5090 Epoch 114/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3120 - accuracy: 0.5965 - val_loss: 1.3906 - val_accuracy: 0.5600 Epoch 115/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3001 - accuracy: 0.5996 - val_loss: 1.4020 - val_accuracy: 0.5623 Epoch 116/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2993 - accuracy: 0.5981 - val_loss: 1.4299 - val_accuracy: 0.5460 Epoch 117/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3001 - accuracy: 0.5998 - val_loss: 1.4500 - val_accuracy: 0.5437 Epoch 118/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2970 - accuracy: 0.5995 - val_loss: 1.3762 - val_accuracy: 0.5657 Epoch 119/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3047 - accuracy: 0.5943 - val_loss: 1.4198 - val_accuracy: 0.5477 Epoch 120/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2911 - accuracy: 0.6006 - val_loss: 1.3959 - val_accuracy: 0.5517 Epoch 121/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2860 - accuracy: 0.6057 - val_loss: 1.3949 - val_accuracy: 0.5623 Epoch 122/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2855 - accuracy: 0.6008 - val_loss: 1.4563 - val_accuracy: 0.5350 Epoch 123/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2889 - accuracy: 0.6005 - val_loss: 1.3314 - val_accuracy: 0.5890 Epoch 124/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2798 - accuracy: 0.6047 - val_loss: 1.4215 - val_accuracy: 0.5487 Epoch 125/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2863 - accuracy: 0.6028 - val_loss: 1.3136 - val_accuracy: 0.5873 Epoch 126/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2789 - accuracy: 0.6062 - val_loss: 1.4181 - val_accuracy: 0.5587 Epoch 127/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2767 - accuracy: 0.6091 - val_loss: 1.3925 - val_accuracy: 0.5563 Epoch 128/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2722 - accuracy: 0.6086 - val_loss: 1.3293 - val_accuracy: 0.5833 Epoch 129/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2790 - accuracy: 0.6005 - val_loss: 1.3498 - val_accuracy: 0.5820 Epoch 130/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2665 - accuracy: 0.6082 - val_loss: 1.3468 - val_accuracy: 0.5703 Epoch 131/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2647 - accuracy: 0.6073 - val_loss: 1.3768 - val_accuracy: 0.5607 Epoch 132/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2608 - accuracy: 0.6100 - val_loss: 1.3159 - val_accuracy: 0.5957 Epoch 133/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2586 - accuracy: 0.6115 - val_loss: 1.3180 - val_accuracy: 0.5933 Epoch 134/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2551 - accuracy: 0.6108 - val_loss: 1.3253 - val_accuracy: 0.5780 Epoch 135/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2529 - accuracy: 0.6128 - val_loss: 1.2984 - val_accuracy: 0.5963 Epoch 136/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2546 - accuracy: 0.6131 - val_loss: 1.3657 - val_accuracy: 0.5690 Epoch 137/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2525 - accuracy: 0.6150 - val_loss: 1.3358 - val_accuracy: 0.5740 Epoch 138/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2478 - accuracy: 0.6175 - val_loss: 1.2823 - val_accuracy: 0.6043 Epoch 139/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2464 - accuracy: 0.6135 - val_loss: 1.3091 - val_accuracy: 0.5920 Epoch 140/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2476 - accuracy: 0.6153 - val_loss: 1.3220 - val_accuracy: 0.5820 Epoch 141/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2366 - accuracy: 0.6160 - val_loss: 1.3006 - val_accuracy: 0.5887 Epoch 142/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2420 - accuracy: 0.6154 - val_loss: 1.2842 - val_accuracy: 0.6020 Epoch 143/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2376 - accuracy: 0.6184 - val_loss: 1.2658 - val_accuracy: 0.6073 Epoch 144/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2449 - accuracy: 0.6152 - val_loss: 1.3947 - val_accuracy: 0.5573 Epoch 145/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2322 - accuracy: 0.6155 - val_loss: 1.3416 - val_accuracy: 0.5817 Epoch 146/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2248 - accuracy: 0.6165 - val_loss: 1.3243 - val_accuracy: 0.5817 Epoch 147/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2296 - accuracy: 0.6172 - val_loss: 1.2479 - val_accuracy: 0.6117 Epoch 148/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2267 - accuracy: 0.6203 - val_loss: 1.2624 - val_accuracy: 0.6163 Epoch 149/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2231 - accuracy: 0.6196 - val_loss: 1.3174 - val_accuracy: 0.5803 Epoch 150/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2162 - accuracy: 0.6185 - val_loss: 1.3386 - val_accuracy: 0.5730
history = baseline31luminosity_model.fit(res31luminosity_train_data, validation_data = res31luminosity_validation_data, epochs = 150, callbacks = earlyStopper)
cnnf.class_image(res31luminosity_full_data, class_names, cmap = "gray")
cnnf.model_metric_graph(history, baseline31luminosity_model, res31luminosity_validation_data, class_names= class_names,name = "baseline model for 31x31 image (greyscaled using luminosity method)")
Epoch 1/150 283/283 [==============================] - 5s 17ms/step - loss: 2.6063 - accuracy: 0.1224 - val_loss: 2.7064 - val_accuracy: 0.0887 Epoch 2/150 283/283 [==============================] - 5s 17ms/step - loss: 2.5028 - accuracy: 0.1694 - val_loss: 2.6157 - val_accuracy: 0.1403 Epoch 3/150 283/283 [==============================] - 5s 16ms/step - loss: 2.4363 - accuracy: 0.1951 - val_loss: 2.5341 - val_accuracy: 0.1583 Epoch 4/150 283/283 [==============================] - 5s 16ms/step - loss: 2.3626 - accuracy: 0.2148 - val_loss: 2.4281 - val_accuracy: 0.1800 Epoch 5/150 283/283 [==============================] - 5s 16ms/step - loss: 2.2896 - accuracy: 0.2312 - val_loss: 2.4209 - val_accuracy: 0.1653 Epoch 6/150 283/283 [==============================] - 5s 16ms/step - loss: 2.2509 - accuracy: 0.2455 - val_loss: 2.2842 - val_accuracy: 0.1897 Epoch 7/150 283/283 [==============================] - 5s 17ms/step - loss: 2.2125 - accuracy: 0.2550 - val_loss: 2.2573 - val_accuracy: 0.2180 Epoch 8/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1848 - accuracy: 0.2724 - val_loss: 2.2667 - val_accuracy: 0.2220 Epoch 9/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1559 - accuracy: 0.2868 - val_loss: 2.1842 - val_accuracy: 0.2447 Epoch 10/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1314 - accuracy: 0.2957 - val_loss: 2.1990 - val_accuracy: 0.2530 Epoch 11/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1034 - accuracy: 0.3127 - val_loss: 2.1321 - val_accuracy: 0.2900 Epoch 12/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0757 - accuracy: 0.3291 - val_loss: 2.1684 - val_accuracy: 0.2723 Epoch 13/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0557 - accuracy: 0.3382 - val_loss: 2.2160 - val_accuracy: 0.2803 Epoch 14/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0358 - accuracy: 0.3487 - val_loss: 2.1231 - val_accuracy: 0.2893 Epoch 15/150 283/283 [==============================] - 5s 16ms/step - loss: 2.0184 - accuracy: 0.3611 - val_loss: 2.0848 - val_accuracy: 0.3327 Epoch 16/150 283/283 [==============================] - 5s 17ms/step - loss: 1.9924 - accuracy: 0.3736 - val_loss: 2.0386 - val_accuracy: 0.3367 Epoch 17/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9723 - accuracy: 0.3805 - val_loss: 2.0254 - val_accuracy: 0.3260 Epoch 18/150 283/283 [==============================] - 5s 17ms/step - loss: 1.9552 - accuracy: 0.3826 - val_loss: 2.0397 - val_accuracy: 0.3317 Epoch 19/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9360 - accuracy: 0.3893 - val_loss: 2.0688 - val_accuracy: 0.3373 Epoch 20/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9181 - accuracy: 0.3985 - val_loss: 1.9631 - val_accuracy: 0.3843 Epoch 21/150 283/283 [==============================] - 5s 17ms/step - loss: 1.9002 - accuracy: 0.4080 - val_loss: 2.0367 - val_accuracy: 0.3423 Epoch 22/150 283/283 [==============================] - 5s 17ms/step - loss: 1.8802 - accuracy: 0.4146 - val_loss: 1.9451 - val_accuracy: 0.3590 Epoch 23/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8603 - accuracy: 0.4251 - val_loss: 1.8835 - val_accuracy: 0.3950 Epoch 24/150 283/283 [==============================] - 5s 17ms/step - loss: 1.8391 - accuracy: 0.4331 - val_loss: 2.0038 - val_accuracy: 0.3477 Epoch 25/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8276 - accuracy: 0.4341 - val_loss: 1.9153 - val_accuracy: 0.3927 Epoch 26/150 283/283 [==============================] - 5s 17ms/step - loss: 1.8031 - accuracy: 0.4421 - val_loss: 1.8470 - val_accuracy: 0.4197 Epoch 27/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7879 - accuracy: 0.4466 - val_loss: 1.8781 - val_accuracy: 0.4130 Epoch 28/150 283/283 [==============================] - 5s 17ms/step - loss: 1.7720 - accuracy: 0.4541 - val_loss: 1.8665 - val_accuracy: 0.4037 Epoch 29/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7549 - accuracy: 0.4598 - val_loss: 1.8055 - val_accuracy: 0.4380 Epoch 30/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7379 - accuracy: 0.4682 - val_loss: 1.8561 - val_accuracy: 0.4157 Epoch 31/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7312 - accuracy: 0.4647 - val_loss: 1.8467 - val_accuracy: 0.4053 Epoch 32/150 283/283 [==============================] - 5s 17ms/step - loss: 1.7161 - accuracy: 0.4715 - val_loss: 1.7961 - val_accuracy: 0.4210 Epoch 33/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7030 - accuracy: 0.4754 - val_loss: 1.7741 - val_accuracy: 0.4437 Epoch 34/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6863 - accuracy: 0.4730 - val_loss: 1.7565 - val_accuracy: 0.4577 Epoch 35/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6876 - accuracy: 0.4750 - val_loss: 1.7890 - val_accuracy: 0.4283 Epoch 36/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6768 - accuracy: 0.4826 - val_loss: 1.7582 - val_accuracy: 0.4450 Epoch 37/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6627 - accuracy: 0.4867 - val_loss: 1.7538 - val_accuracy: 0.4487 Epoch 38/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6499 - accuracy: 0.4896 - val_loss: 1.7765 - val_accuracy: 0.4407 Epoch 39/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6397 - accuracy: 0.4975 - val_loss: 1.7106 - val_accuracy: 0.4563 Epoch 40/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6334 - accuracy: 0.4976 - val_loss: 1.6866 - val_accuracy: 0.4717 Epoch 41/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6239 - accuracy: 0.4980 - val_loss: 1.6846 - val_accuracy: 0.4733 Epoch 42/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6240 - accuracy: 0.4944 - val_loss: 1.6955 - val_accuracy: 0.4837 Epoch 43/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6092 - accuracy: 0.4998 - val_loss: 1.7516 - val_accuracy: 0.4473 Epoch 44/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6010 - accuracy: 0.5084 - val_loss: 1.6539 - val_accuracy: 0.4833 Epoch 45/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5951 - accuracy: 0.5078 - val_loss: 1.6856 - val_accuracy: 0.4770 Epoch 46/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5920 - accuracy: 0.5119 - val_loss: 1.6379 - val_accuracy: 0.4807 Epoch 47/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5848 - accuracy: 0.5090 - val_loss: 1.6709 - val_accuracy: 0.4770 Epoch 48/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5782 - accuracy: 0.5078 - val_loss: 1.6400 - val_accuracy: 0.4830 Epoch 49/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5727 - accuracy: 0.5101 - val_loss: 1.6595 - val_accuracy: 0.4860 Epoch 50/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5651 - accuracy: 0.5167 - val_loss: 1.6012 - val_accuracy: 0.5020 Epoch 51/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5524 - accuracy: 0.5214 - val_loss: 1.6126 - val_accuracy: 0.4930 Epoch 52/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5445 - accuracy: 0.5249 - val_loss: 1.6313 - val_accuracy: 0.4940 Epoch 53/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5446 - accuracy: 0.5232 - val_loss: 1.5938 - val_accuracy: 0.5047 Epoch 54/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5408 - accuracy: 0.5207 - val_loss: 1.6513 - val_accuracy: 0.4727 Epoch 55/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5284 - accuracy: 0.5266 - val_loss: 1.5955 - val_accuracy: 0.5010 Epoch 56/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5226 - accuracy: 0.5265 - val_loss: 1.6530 - val_accuracy: 0.4763 Epoch 57/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5287 - accuracy: 0.5254 - val_loss: 1.5674 - val_accuracy: 0.5117 Epoch 58/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5216 - accuracy: 0.5328 - val_loss: 1.6385 - val_accuracy: 0.4830 Epoch 59/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5190 - accuracy: 0.5332 - val_loss: 1.5619 - val_accuracy: 0.5093 Epoch 60/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5066 - accuracy: 0.5312 - val_loss: 1.5612 - val_accuracy: 0.5170 Epoch 61/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5029 - accuracy: 0.5327 - val_loss: 1.5763 - val_accuracy: 0.5107 Epoch 62/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5049 - accuracy: 0.5290 - val_loss: 1.5507 - val_accuracy: 0.5157 Epoch 63/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4920 - accuracy: 0.5379 - val_loss: 1.5690 - val_accuracy: 0.5170 Epoch 64/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4970 - accuracy: 0.5353 - val_loss: 1.5696 - val_accuracy: 0.5057 Epoch 65/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4844 - accuracy: 0.5433 - val_loss: 1.6313 - val_accuracy: 0.4803 Epoch 66/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4754 - accuracy: 0.5401 - val_loss: 1.5134 - val_accuracy: 0.5340 Epoch 67/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4852 - accuracy: 0.5407 - val_loss: 1.5626 - val_accuracy: 0.5203 Epoch 68/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4709 - accuracy: 0.5449 - val_loss: 1.4907 - val_accuracy: 0.5453 Epoch 69/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4602 - accuracy: 0.5525 - val_loss: 1.6792 - val_accuracy: 0.4683 Epoch 70/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4569 - accuracy: 0.5484 - val_loss: 1.6204 - val_accuracy: 0.4880 Epoch 71/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4628 - accuracy: 0.5436 - val_loss: 1.7631 - val_accuracy: 0.4347 Epoch 72/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4589 - accuracy: 0.5482 - val_loss: 1.5313 - val_accuracy: 0.5217 Epoch 73/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4501 - accuracy: 0.5491 - val_loss: 1.5195 - val_accuracy: 0.5290 Epoch 74/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4433 - accuracy: 0.5572 - val_loss: 1.4911 - val_accuracy: 0.5353 Epoch 75/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4414 - accuracy: 0.5521 - val_loss: 1.4999 - val_accuracy: 0.5317 Epoch 76/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4333 - accuracy: 0.5548 - val_loss: 1.4409 - val_accuracy: 0.5663 Epoch 77/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4336 - accuracy: 0.5544 - val_loss: 1.6517 - val_accuracy: 0.4847 Epoch 78/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4335 - accuracy: 0.5609 - val_loss: 1.4710 - val_accuracy: 0.5537 Epoch 79/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4222 - accuracy: 0.5595 - val_loss: 1.5298 - val_accuracy: 0.5260 Epoch 80/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4119 - accuracy: 0.5647 - val_loss: 1.4650 - val_accuracy: 0.5450 Epoch 81/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4157 - accuracy: 0.5562 - val_loss: 1.4640 - val_accuracy: 0.5470 Epoch 82/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4094 - accuracy: 0.5605 - val_loss: 1.4421 - val_accuracy: 0.5540 Epoch 83/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4065 - accuracy: 0.5649 - val_loss: 1.4964 - val_accuracy: 0.5330 Epoch 84/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4005 - accuracy: 0.5656 - val_loss: 1.4465 - val_accuracy: 0.5380 Epoch 85/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3962 - accuracy: 0.5671 - val_loss: 1.4374 - val_accuracy: 0.5463 Epoch 86/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3880 - accuracy: 0.5693 - val_loss: 1.5104 - val_accuracy: 0.5217 Epoch 87/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3904 - accuracy: 0.5703 - val_loss: 1.5098 - val_accuracy: 0.5243 Epoch 88/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3819 - accuracy: 0.5696 - val_loss: 1.4578 - val_accuracy: 0.5443 Epoch 89/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3796 - accuracy: 0.5735 - val_loss: 1.5387 - val_accuracy: 0.5147 Epoch 90/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3793 - accuracy: 0.5719 - val_loss: 1.5192 - val_accuracy: 0.4980 Epoch 91/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3730 - accuracy: 0.5706 - val_loss: 1.4750 - val_accuracy: 0.5350 Epoch 92/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3670 - accuracy: 0.5776 - val_loss: 1.4148 - val_accuracy: 0.5577 Epoch 93/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3601 - accuracy: 0.5772 - val_loss: 1.4200 - val_accuracy: 0.5517 Epoch 94/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3594 - accuracy: 0.5770 - val_loss: 1.4051 - val_accuracy: 0.5637 Epoch 95/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3575 - accuracy: 0.5810 - val_loss: 1.4740 - val_accuracy: 0.5370 Epoch 96/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3640 - accuracy: 0.5769 - val_loss: 1.3730 - val_accuracy: 0.5810 Epoch 97/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3452 - accuracy: 0.5820 - val_loss: 1.4559 - val_accuracy: 0.5380 Epoch 98/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3538 - accuracy: 0.5801 - val_loss: 1.4350 - val_accuracy: 0.5480 Epoch 99/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3437 - accuracy: 0.5850 - val_loss: 1.3673 - val_accuracy: 0.5780 Epoch 100/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3308 - accuracy: 0.5844 - val_loss: 1.4135 - val_accuracy: 0.5523 Epoch 101/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3352 - accuracy: 0.5843 - val_loss: 1.3741 - val_accuracy: 0.5730 Epoch 102/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3293 - accuracy: 0.5875 - val_loss: 1.3620 - val_accuracy: 0.5767 Epoch 103/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3332 - accuracy: 0.5853 - val_loss: 1.3590 - val_accuracy: 0.5877 Epoch 104/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3268 - accuracy: 0.5856 - val_loss: 1.3910 - val_accuracy: 0.5603 Epoch 105/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3198 - accuracy: 0.5913 - val_loss: 1.3704 - val_accuracy: 0.5770 Epoch 106/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3166 - accuracy: 0.5908 - val_loss: 1.3862 - val_accuracy: 0.5690 Epoch 107/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3147 - accuracy: 0.5909 - val_loss: 1.3583 - val_accuracy: 0.5770 Epoch 108/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3164 - accuracy: 0.5906 - val_loss: 1.3342 - val_accuracy: 0.5917 Epoch 109/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3112 - accuracy: 0.5960 - val_loss: 1.3450 - val_accuracy: 0.5843 Epoch 110/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2945 - accuracy: 0.5990 - val_loss: 1.3374 - val_accuracy: 0.5800 Epoch 111/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3016 - accuracy: 0.5978 - val_loss: 1.3935 - val_accuracy: 0.5557 Epoch 112/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2910 - accuracy: 0.5971 - val_loss: 1.3175 - val_accuracy: 0.5907 Epoch 113/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2876 - accuracy: 0.6031 - val_loss: 1.3329 - val_accuracy: 0.5883 Epoch 114/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2867 - accuracy: 0.5994 - val_loss: 1.5039 - val_accuracy: 0.5217 Epoch 115/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2820 - accuracy: 0.6079 - val_loss: 1.3679 - val_accuracy: 0.5710 Epoch 116/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2780 - accuracy: 0.6019 - val_loss: 1.3559 - val_accuracy: 0.5800 Epoch 117/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2770 - accuracy: 0.6076 - val_loss: 1.3213 - val_accuracy: 0.5957 Epoch 118/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2730 - accuracy: 0.6080 - val_loss: 1.3322 - val_accuracy: 0.5767 Epoch 119/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2645 - accuracy: 0.6067 - val_loss: 1.3369 - val_accuracy: 0.5750 Epoch 120/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2706 - accuracy: 0.6069 - val_loss: 1.4067 - val_accuracy: 0.5510 Epoch 121/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2616 - accuracy: 0.6071 - val_loss: 1.3108 - val_accuracy: 0.6003 Epoch 122/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2610 - accuracy: 0.6068 - val_loss: 1.2948 - val_accuracy: 0.5987 Epoch 123/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2565 - accuracy: 0.6092 - val_loss: 1.3475 - val_accuracy: 0.5733 Epoch 124/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2490 - accuracy: 0.6122 - val_loss: 1.3091 - val_accuracy: 0.6000 Epoch 125/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2465 - accuracy: 0.6141 - val_loss: 1.3190 - val_accuracy: 0.5827 Epoch 126/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2438 - accuracy: 0.6131 - val_loss: 1.3704 - val_accuracy: 0.5707 Epoch 127/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2467 - accuracy: 0.6139 - val_loss: 1.3489 - val_accuracy: 0.5627 Epoch 128/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2378 - accuracy: 0.6180 - val_loss: 1.3526 - val_accuracy: 0.5790 Epoch 129/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2410 - accuracy: 0.6139 - val_loss: 1.2705 - val_accuracy: 0.6083 Epoch 130/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2297 - accuracy: 0.6171 - val_loss: 1.2404 - val_accuracy: 0.6240 Epoch 131/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2261 - accuracy: 0.6225 - val_loss: 1.3065 - val_accuracy: 0.5933 Epoch 132/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2286 - accuracy: 0.6182 - val_loss: 1.3260 - val_accuracy: 0.5840 Epoch 133/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2286 - accuracy: 0.6218 - val_loss: 1.2427 - val_accuracy: 0.6243 Epoch 134/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2188 - accuracy: 0.6196 - val_loss: 1.2618 - val_accuracy: 0.6163 Epoch 135/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2173 - accuracy: 0.6222 - val_loss: 1.2499 - val_accuracy: 0.6213 Epoch 136/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2170 - accuracy: 0.6225 - val_loss: 1.2753 - val_accuracy: 0.6100 Epoch 137/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2134 - accuracy: 0.6272 - val_loss: 1.2345 - val_accuracy: 0.6247 Epoch 138/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2028 - accuracy: 0.6298 - val_loss: 1.3371 - val_accuracy: 0.5710 Epoch 139/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2129 - accuracy: 0.6266 - val_loss: 1.2382 - val_accuracy: 0.6247 Epoch 140/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2101 - accuracy: 0.6297 - val_loss: 1.2550 - val_accuracy: 0.6210 Epoch 141/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1998 - accuracy: 0.6305 - val_loss: 1.2408 - val_accuracy: 0.6243 Epoch 142/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1921 - accuracy: 0.6331 - val_loss: 1.2618 - val_accuracy: 0.6120 Epoch 143/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1885 - accuracy: 0.6321 - val_loss: 1.2414 - val_accuracy: 0.6213 Epoch 144/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2065 - accuracy: 0.6313 - val_loss: 1.2485 - val_accuracy: 0.6143 Epoch 145/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1859 - accuracy: 0.6326 - val_loss: 1.2172 - val_accuracy: 0.6343 Epoch 146/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1878 - accuracy: 0.6313 - val_loss: 1.2465 - val_accuracy: 0.6173 Epoch 147/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1842 - accuracy: 0.6368 - val_loss: 1.2771 - val_accuracy: 0.5970 Epoch 148/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1839 - accuracy: 0.6355 - val_loss: 1.2203 - val_accuracy: 0.6243 Epoch 149/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1832 - accuracy: 0.6368 - val_loss: 1.2591 - val_accuracy: 0.6160 Epoch 150/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1706 - accuracy: 0.6450 - val_loss: 1.2177 - val_accuracy: 0.6267
history = baseline31desaturation_model.fit(res31desaturation_train_data, validation_data = res31desaturation_validation_data, epochs = 150, callbacks = earlyStopper)
cnnf.class_image(res31desaturation_full_data, class_names, cmap = "gray")
cnnf.model_metric_graph(history, baseline31desaturation_model, res31desaturation_validation_data, class_names= class_names,name = "baseline model for 31x31 image (greyscaled using desaturation method)")
Epoch 1/150 283/283 [==============================] - 5s 17ms/step - loss: 2.6026 - accuracy: 0.1380 - val_loss: 2.7011 - val_accuracy: 0.1083 Epoch 2/150 283/283 [==============================] - 5s 16ms/step - loss: 2.4533 - accuracy: 0.2009 - val_loss: 2.5655 - val_accuracy: 0.1530 Epoch 3/150 283/283 [==============================] - 5s 16ms/step - loss: 2.3437 - accuracy: 0.2509 - val_loss: 2.4339 - val_accuracy: 0.1813 Epoch 4/150 283/283 [==============================] - 5s 17ms/step - loss: 2.2562 - accuracy: 0.2829 - val_loss: 2.3488 - val_accuracy: 0.2227 Epoch 5/150 283/283 [==============================] - 5s 16ms/step - loss: 2.1928 - accuracy: 0.3139 - val_loss: 2.3247 - val_accuracy: 0.2287 Epoch 6/150 283/283 [==============================] - 5s 16ms/step - loss: 2.1545 - accuracy: 0.3293 - val_loss: 2.2466 - val_accuracy: 0.2720 Epoch 7/150 283/283 [==============================] - 5s 16ms/step - loss: 2.1193 - accuracy: 0.3404 - val_loss: 2.2192 - val_accuracy: 0.2880 Epoch 8/150 283/283 [==============================] - 5s 16ms/step - loss: 2.0893 - accuracy: 0.3552 - val_loss: 2.1857 - val_accuracy: 0.3023 Epoch 9/150 283/283 [==============================] - 5s 16ms/step - loss: 2.0647 - accuracy: 0.3656 - val_loss: 2.1673 - val_accuracy: 0.3177 Epoch 10/150 283/283 [==============================] - 5s 16ms/step - loss: 2.0353 - accuracy: 0.3756 - val_loss: 2.1951 - val_accuracy: 0.2970 Epoch 11/150 283/283 [==============================] - 5s 16ms/step - loss: 2.0041 - accuracy: 0.3901 - val_loss: 2.1045 - val_accuracy: 0.3343 Epoch 12/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9805 - accuracy: 0.3964 - val_loss: 2.0605 - val_accuracy: 0.3390 Epoch 13/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9566 - accuracy: 0.4026 - val_loss: 2.0713 - val_accuracy: 0.3363 Epoch 14/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9341 - accuracy: 0.4087 - val_loss: 2.0426 - val_accuracy: 0.3437 Epoch 15/150 283/283 [==============================] - 5s 16ms/step - loss: 1.9124 - accuracy: 0.4174 - val_loss: 2.0489 - val_accuracy: 0.3523 Epoch 16/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8878 - accuracy: 0.4219 - val_loss: 2.0230 - val_accuracy: 0.3553 Epoch 17/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8698 - accuracy: 0.4322 - val_loss: 1.9688 - val_accuracy: 0.3703 Epoch 18/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8569 - accuracy: 0.4308 - val_loss: 2.0094 - val_accuracy: 0.3647 Epoch 19/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8357 - accuracy: 0.4412 - val_loss: 1.9415 - val_accuracy: 0.3810 Epoch 20/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8217 - accuracy: 0.4456 - val_loss: 1.9150 - val_accuracy: 0.3820 Epoch 21/150 283/283 [==============================] - 5s 16ms/step - loss: 1.8157 - accuracy: 0.4445 - val_loss: 1.9028 - val_accuracy: 0.3963 Epoch 22/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7920 - accuracy: 0.4527 - val_loss: 2.0439 - val_accuracy: 0.3610 Epoch 23/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7797 - accuracy: 0.4582 - val_loss: 1.8999 - val_accuracy: 0.3983 Epoch 24/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7636 - accuracy: 0.4569 - val_loss: 1.8940 - val_accuracy: 0.3930 Epoch 25/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7519 - accuracy: 0.4638 - val_loss: 1.8371 - val_accuracy: 0.4130 Epoch 26/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7378 - accuracy: 0.4700 - val_loss: 1.9151 - val_accuracy: 0.3820 Epoch 27/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7268 - accuracy: 0.4709 - val_loss: 1.8069 - val_accuracy: 0.4230 Epoch 28/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7170 - accuracy: 0.4753 - val_loss: 1.8362 - val_accuracy: 0.4173 Epoch 29/150 283/283 [==============================] - 5s 16ms/step - loss: 1.7030 - accuracy: 0.4787 - val_loss: 1.9090 - val_accuracy: 0.3910 Epoch 30/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6920 - accuracy: 0.4783 - val_loss: 1.8621 - val_accuracy: 0.3943 Epoch 31/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6859 - accuracy: 0.4837 - val_loss: 1.8061 - val_accuracy: 0.4153 Epoch 32/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6731 - accuracy: 0.4854 - val_loss: 1.8779 - val_accuracy: 0.4097 Epoch 33/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6557 - accuracy: 0.4900 - val_loss: 1.7849 - val_accuracy: 0.4140 Epoch 34/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6480 - accuracy: 0.4960 - val_loss: 1.7890 - val_accuracy: 0.4263 Epoch 35/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6363 - accuracy: 0.4994 - val_loss: 1.7354 - val_accuracy: 0.4417 Epoch 36/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6270 - accuracy: 0.5029 - val_loss: 1.7762 - val_accuracy: 0.4267 Epoch 37/150 283/283 [==============================] - 5s 16ms/step - loss: 1.6201 - accuracy: 0.5027 - val_loss: 1.7330 - val_accuracy: 0.4363 Epoch 38/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6088 - accuracy: 0.5062 - val_loss: 1.6878 - val_accuracy: 0.4510 Epoch 39/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5995 - accuracy: 0.5063 - val_loss: 1.7255 - val_accuracy: 0.4440 Epoch 40/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5904 - accuracy: 0.5085 - val_loss: 1.7136 - val_accuracy: 0.4417 Epoch 41/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5763 - accuracy: 0.5097 - val_loss: 1.6858 - val_accuracy: 0.4607 Epoch 42/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5713 - accuracy: 0.5103 - val_loss: 1.7008 - val_accuracy: 0.4567 Epoch 43/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5692 - accuracy: 0.5184 - val_loss: 1.6497 - val_accuracy: 0.4677 Epoch 44/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5517 - accuracy: 0.5219 - val_loss: 1.6311 - val_accuracy: 0.4720 Epoch 45/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5527 - accuracy: 0.5246 - val_loss: 1.6732 - val_accuracy: 0.4510 Epoch 46/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5391 - accuracy: 0.5250 - val_loss: 1.6500 - val_accuracy: 0.4683 Epoch 47/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5321 - accuracy: 0.5233 - val_loss: 1.6368 - val_accuracy: 0.4667 Epoch 48/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5248 - accuracy: 0.5223 - val_loss: 1.6478 - val_accuracy: 0.4557 Epoch 49/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5117 - accuracy: 0.5302 - val_loss: 1.6446 - val_accuracy: 0.4613 Epoch 50/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5088 - accuracy: 0.5301 - val_loss: 1.6117 - val_accuracy: 0.4750 Epoch 51/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5091 - accuracy: 0.5308 - val_loss: 1.6862 - val_accuracy: 0.4490 Epoch 52/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5041 - accuracy: 0.5279 - val_loss: 1.5831 - val_accuracy: 0.4850 Epoch 53/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4848 - accuracy: 0.5336 - val_loss: 1.5944 - val_accuracy: 0.4680 Epoch 54/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4867 - accuracy: 0.5360 - val_loss: 1.6392 - val_accuracy: 0.4557 Epoch 55/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4811 - accuracy: 0.5392 - val_loss: 1.5833 - val_accuracy: 0.4853 Epoch 56/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4691 - accuracy: 0.5413 - val_loss: 1.5928 - val_accuracy: 0.4830 Epoch 57/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4702 - accuracy: 0.5404 - val_loss: 1.5638 - val_accuracy: 0.4930 Epoch 58/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4524 - accuracy: 0.5445 - val_loss: 1.5344 - val_accuracy: 0.5030 Epoch 59/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4523 - accuracy: 0.5481 - val_loss: 1.5436 - val_accuracy: 0.4917 Epoch 60/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4481 - accuracy: 0.5456 - val_loss: 1.5158 - val_accuracy: 0.5060 Epoch 61/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4390 - accuracy: 0.5487 - val_loss: 1.5031 - val_accuracy: 0.5110 Epoch 62/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4427 - accuracy: 0.5477 - val_loss: 1.5658 - val_accuracy: 0.4977 Epoch 63/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4306 - accuracy: 0.5546 - val_loss: 1.5479 - val_accuracy: 0.4950 Epoch 64/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4250 - accuracy: 0.5562 - val_loss: 1.5128 - val_accuracy: 0.4997 Epoch 65/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4214 - accuracy: 0.5554 - val_loss: 1.5263 - val_accuracy: 0.4950 Epoch 66/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4192 - accuracy: 0.5551 - val_loss: 1.6112 - val_accuracy: 0.4750 Epoch 67/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4167 - accuracy: 0.5570 - val_loss: 1.5227 - val_accuracy: 0.5030 Epoch 68/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4054 - accuracy: 0.5587 - val_loss: 1.4898 - val_accuracy: 0.5097 Epoch 69/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4002 - accuracy: 0.5595 - val_loss: 1.4705 - val_accuracy: 0.5277 Epoch 70/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3988 - accuracy: 0.5598 - val_loss: 1.4556 - val_accuracy: 0.5297 Epoch 71/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3958 - accuracy: 0.5644 - val_loss: 1.4928 - val_accuracy: 0.5063 Epoch 72/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3868 - accuracy: 0.5642 - val_loss: 1.4783 - val_accuracy: 0.5167 Epoch 73/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3801 - accuracy: 0.5644 - val_loss: 1.4413 - val_accuracy: 0.5387 Epoch 74/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3788 - accuracy: 0.5690 - val_loss: 1.5009 - val_accuracy: 0.5113 Epoch 75/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3700 - accuracy: 0.5677 - val_loss: 1.4553 - val_accuracy: 0.5180 Epoch 76/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3626 - accuracy: 0.5709 - val_loss: 1.4611 - val_accuracy: 0.5157 Epoch 77/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3609 - accuracy: 0.5709 - val_loss: 1.4334 - val_accuracy: 0.5287 Epoch 78/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3540 - accuracy: 0.5790 - val_loss: 1.4423 - val_accuracy: 0.5213 Epoch 79/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3491 - accuracy: 0.5762 - val_loss: 1.4080 - val_accuracy: 0.5387 Epoch 80/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3497 - accuracy: 0.5795 - val_loss: 1.4128 - val_accuracy: 0.5397 Epoch 81/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3437 - accuracy: 0.5784 - val_loss: 1.4557 - val_accuracy: 0.5060 Epoch 82/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3326 - accuracy: 0.5783 - val_loss: 1.4148 - val_accuracy: 0.5353 Epoch 83/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3280 - accuracy: 0.5833 - val_loss: 1.4037 - val_accuracy: 0.5327 Epoch 84/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3254 - accuracy: 0.5834 - val_loss: 1.5457 - val_accuracy: 0.4803 Epoch 85/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3310 - accuracy: 0.5829 - val_loss: 1.4023 - val_accuracy: 0.5553 Epoch 86/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3216 - accuracy: 0.5864 - val_loss: 1.4342 - val_accuracy: 0.5297 Epoch 87/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3106 - accuracy: 0.5920 - val_loss: 1.3658 - val_accuracy: 0.5560 Epoch 88/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3153 - accuracy: 0.5875 - val_loss: 1.4152 - val_accuracy: 0.5387 Epoch 89/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3065 - accuracy: 0.5886 - val_loss: 1.3658 - val_accuracy: 0.5593 Epoch 90/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3019 - accuracy: 0.5903 - val_loss: 1.4092 - val_accuracy: 0.5297 Epoch 91/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2973 - accuracy: 0.5963 - val_loss: 1.3488 - val_accuracy: 0.5513 Epoch 92/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2925 - accuracy: 0.5953 - val_loss: 1.3673 - val_accuracy: 0.5620 Epoch 93/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2899 - accuracy: 0.5978 - val_loss: 1.4025 - val_accuracy: 0.5320 Epoch 94/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2867 - accuracy: 0.5948 - val_loss: 1.3505 - val_accuracy: 0.5617 Epoch 95/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2843 - accuracy: 0.5963 - val_loss: 1.3670 - val_accuracy: 0.5460 Epoch 96/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2785 - accuracy: 0.5980 - val_loss: 1.3541 - val_accuracy: 0.5537 Epoch 97/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2700 - accuracy: 0.6027 - val_loss: 1.3366 - val_accuracy: 0.5670 Epoch 98/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2747 - accuracy: 0.6011 - val_loss: 1.3356 - val_accuracy: 0.5607 Epoch 99/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2640 - accuracy: 0.6038 - val_loss: 1.3253 - val_accuracy: 0.5630 Epoch 100/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2626 - accuracy: 0.6053 - val_loss: 1.3126 - val_accuracy: 0.5737 Epoch 101/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2594 - accuracy: 0.6054 - val_loss: 1.3368 - val_accuracy: 0.5660 Epoch 102/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2496 - accuracy: 0.6105 - val_loss: 1.3375 - val_accuracy: 0.5623 Epoch 103/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2495 - accuracy: 0.6107 - val_loss: 1.3909 - val_accuracy: 0.5373 Epoch 104/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2440 - accuracy: 0.6071 - val_loss: 1.3363 - val_accuracy: 0.5703 Epoch 105/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2340 - accuracy: 0.6153 - val_loss: 1.2845 - val_accuracy: 0.5797 Epoch 106/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2379 - accuracy: 0.6136 - val_loss: 1.3927 - val_accuracy: 0.5383 Epoch 107/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2392 - accuracy: 0.6066 - val_loss: 1.2860 - val_accuracy: 0.5840 Epoch 108/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2322 - accuracy: 0.6143 - val_loss: 1.2874 - val_accuracy: 0.5883 Epoch 109/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2298 - accuracy: 0.6161 - val_loss: 1.2884 - val_accuracy: 0.5777 Epoch 110/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2222 - accuracy: 0.6170 - val_loss: 1.2650 - val_accuracy: 0.6000 Epoch 111/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2106 - accuracy: 0.6247 - val_loss: 1.2652 - val_accuracy: 0.5950 Epoch 112/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2155 - accuracy: 0.6198 - val_loss: 1.2752 - val_accuracy: 0.5870 Epoch 113/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2141 - accuracy: 0.6180 - val_loss: 1.2787 - val_accuracy: 0.5770 Epoch 114/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2108 - accuracy: 0.6221 - val_loss: 1.2637 - val_accuracy: 0.5997 Epoch 115/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2049 - accuracy: 0.6223 - val_loss: 1.2640 - val_accuracy: 0.5820 Epoch 116/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2102 - accuracy: 0.6217 - val_loss: 1.2800 - val_accuracy: 0.5747 Epoch 117/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2085 - accuracy: 0.6243 - val_loss: 1.3009 - val_accuracy: 0.5680 Epoch 118/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1979 - accuracy: 0.6232 - val_loss: 1.2458 - val_accuracy: 0.5970 Epoch 119/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1958 - accuracy: 0.6241 - val_loss: 1.3023 - val_accuracy: 0.5700 Epoch 120/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1999 - accuracy: 0.6265 - val_loss: 1.2232 - val_accuracy: 0.6100 Epoch 121/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1875 - accuracy: 0.6316 - val_loss: 1.2456 - val_accuracy: 0.5963 Epoch 122/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1891 - accuracy: 0.6299 - val_loss: 1.2306 - val_accuracy: 0.6007 Epoch 123/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1784 - accuracy: 0.6333 - val_loss: 1.2213 - val_accuracy: 0.6090 Epoch 124/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1751 - accuracy: 0.6311 - val_loss: 1.2090 - val_accuracy: 0.6110 Epoch 125/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1758 - accuracy: 0.6306 - val_loss: 1.2977 - val_accuracy: 0.5780 Epoch 126/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1671 - accuracy: 0.6349 - val_loss: 1.2071 - val_accuracy: 0.6110 Epoch 127/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1667 - accuracy: 0.6324 - val_loss: 1.2296 - val_accuracy: 0.5990 Epoch 128/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1616 - accuracy: 0.6374 - val_loss: 1.2524 - val_accuracy: 0.5927 Epoch 129/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1640 - accuracy: 0.6368 - val_loss: 1.2445 - val_accuracy: 0.5973 Epoch 130/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1557 - accuracy: 0.6334 - val_loss: 1.1929 - val_accuracy: 0.6077 Epoch 131/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1567 - accuracy: 0.6336 - val_loss: 1.2147 - val_accuracy: 0.6137 Epoch 132/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1539 - accuracy: 0.6398 - val_loss: 1.3392 - val_accuracy: 0.5600 Epoch 133/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1595 - accuracy: 0.6376 - val_loss: 1.2181 - val_accuracy: 0.6097 Epoch 134/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1444 - accuracy: 0.6420 - val_loss: 1.2162 - val_accuracy: 0.6093 Epoch 135/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1410 - accuracy: 0.6427 - val_loss: 1.1900 - val_accuracy: 0.6063 Epoch 136/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1478 - accuracy: 0.6383 - val_loss: 1.1700 - val_accuracy: 0.6253 Epoch 137/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1338 - accuracy: 0.6443 - val_loss: 1.1792 - val_accuracy: 0.6117 Epoch 138/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1363 - accuracy: 0.6443 - val_loss: 1.1894 - val_accuracy: 0.6140 Epoch 139/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1398 - accuracy: 0.6452 - val_loss: 1.1522 - val_accuracy: 0.6310 Epoch 140/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1262 - accuracy: 0.6478 - val_loss: 1.1833 - val_accuracy: 0.6210 Epoch 141/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1231 - accuracy: 0.6482 - val_loss: 1.3564 - val_accuracy: 0.5613 Epoch 142/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1277 - accuracy: 0.6432 - val_loss: 1.2070 - val_accuracy: 0.6023 Epoch 143/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1258 - accuracy: 0.6421 - val_loss: 1.1453 - val_accuracy: 0.6393 Epoch 144/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1238 - accuracy: 0.6459 - val_loss: 1.1655 - val_accuracy: 0.6227 Epoch 145/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1130 - accuracy: 0.6481 - val_loss: 1.1570 - val_accuracy: 0.6240 Epoch 146/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1153 - accuracy: 0.6525 - val_loss: 1.1670 - val_accuracy: 0.6183 Epoch 147/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1113 - accuracy: 0.6498 - val_loss: 1.1578 - val_accuracy: 0.6290 Epoch 148/150 283/283 [==============================] - 5s 16ms/step - loss: 1.1086 - accuracy: 0.6518 - val_loss: 1.1529 - val_accuracy: 0.6353 Epoch 149/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1076 - accuracy: 0.6540 - val_loss: 1.1377 - val_accuracy: 0.6403 Epoch 150/150 283/283 [==============================] - 5s 17ms/step - loss: 1.1001 - accuracy: 0.6524 - val_loss: 1.1963 - val_accuracy: 0.6060
history = baseline128_model.fit(res128greyscale_train_data, validation_data = res128greyscale_validation_data, epochs = 150, callbacks = earlyStopper)
cnnf.class_image(res128greyscale_full_data, class_names, cmap = "gray")
cnnf.model_metric_graph(history, baseline128_model, res128greyscale_validation_data, name = "baseline model for 128x128 image (normally greyscaled)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 6s 18ms/step - loss: 2.6408 - accuracy: 0.1078 - val_loss: 2.7554 - val_accuracy: 0.1153 Epoch 2/150 283/283 [==============================] - 5s 18ms/step - loss: 2.5569 - accuracy: 0.1781 - val_loss: 2.6903 - val_accuracy: 0.1213 Epoch 3/150 283/283 [==============================] - 5s 18ms/step - loss: 2.4801 - accuracy: 0.2049 - val_loss: 2.6167 - val_accuracy: 0.1713 Epoch 4/150 283/283 [==============================] - 6s 18ms/step - loss: 2.4137 - accuracy: 0.2424 - val_loss: 2.5369 - val_accuracy: 0.1783 Epoch 5/150 283/283 [==============================] - 6s 20ms/step - loss: 2.3513 - accuracy: 0.2604 - val_loss: 2.5079 - val_accuracy: 0.1960 Epoch 6/150 283/283 [==============================] - 6s 18ms/step - loss: 2.2922 - accuracy: 0.2734 - val_loss: 2.4538 - val_accuracy: 0.2010 Epoch 7/150 283/283 [==============================] - 6s 18ms/step - loss: 2.2458 - accuracy: 0.2854 - val_loss: 2.4360 - val_accuracy: 0.1957 Epoch 8/150 283/283 [==============================] - 6s 19ms/step - loss: 2.2075 - accuracy: 0.2951 - val_loss: 2.4675 - val_accuracy: 0.1993 Epoch 9/150 283/283 [==============================] - 6s 19ms/step - loss: 2.1802 - accuracy: 0.2985 - val_loss: 2.2598 - val_accuracy: 0.2677 Epoch 10/150 283/283 [==============================] - 5s 18ms/step - loss: 2.1506 - accuracy: 0.3106 - val_loss: 2.2515 - val_accuracy: 0.2597 Epoch 11/150 283/283 [==============================] - 6s 18ms/step - loss: 2.1289 - accuracy: 0.3199 - val_loss: 2.2675 - val_accuracy: 0.2517 Epoch 12/150 283/283 [==============================] - 6s 19ms/step - loss: 2.1175 - accuracy: 0.3219 - val_loss: 2.2753 - val_accuracy: 0.2287 Epoch 13/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0991 - accuracy: 0.3242 - val_loss: 2.1941 - val_accuracy: 0.2947 Epoch 14/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0916 - accuracy: 0.3351 - val_loss: 2.1938 - val_accuracy: 0.2753 Epoch 15/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0719 - accuracy: 0.3394 - val_loss: 2.1759 - val_accuracy: 0.2850 Epoch 16/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0613 - accuracy: 0.3454 - val_loss: 2.1800 - val_accuracy: 0.2897 Epoch 17/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0495 - accuracy: 0.3522 - val_loss: 2.1498 - val_accuracy: 0.2917 Epoch 18/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0433 - accuracy: 0.3502 - val_loss: 2.1099 - val_accuracy: 0.3187 Epoch 19/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0276 - accuracy: 0.3595 - val_loss: 2.0965 - val_accuracy: 0.3213 Epoch 20/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0223 - accuracy: 0.3636 - val_loss: 2.0788 - val_accuracy: 0.3393 Epoch 21/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0046 - accuracy: 0.3720 - val_loss: 2.1813 - val_accuracy: 0.2923 Epoch 22/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9947 - accuracy: 0.3761 - val_loss: 2.0978 - val_accuracy: 0.3210 Epoch 23/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9876 - accuracy: 0.3726 - val_loss: 2.1072 - val_accuracy: 0.3157 Epoch 24/150 283/283 [==============================] - 6s 18ms/step - loss: 1.9760 - accuracy: 0.3881 - val_loss: 2.0860 - val_accuracy: 0.3280 Epoch 25/150 283/283 [==============================] - 6s 19ms/step - loss: 1.9658 - accuracy: 0.3875 - val_loss: 2.0313 - val_accuracy: 0.3503 Epoch 26/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9505 - accuracy: 0.3953 - val_loss: 2.0213 - val_accuracy: 0.3583 Epoch 27/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9359 - accuracy: 0.4000 - val_loss: 2.0214 - val_accuracy: 0.3450 Epoch 28/150 283/283 [==============================] - 5s 17ms/step - loss: 1.9220 - accuracy: 0.3996 - val_loss: 1.9815 - val_accuracy: 0.3650 Epoch 29/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9169 - accuracy: 0.4023 - val_loss: 1.9588 - val_accuracy: 0.3870 Epoch 30/150 283/283 [==============================] - 5s 17ms/step - loss: 1.9066 - accuracy: 0.4068 - val_loss: 1.9691 - val_accuracy: 0.3713 Epoch 31/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8877 - accuracy: 0.4138 - val_loss: 1.9966 - val_accuracy: 0.3553 Epoch 32/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8709 - accuracy: 0.4276 - val_loss: 1.9611 - val_accuracy: 0.3627 Epoch 33/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8757 - accuracy: 0.4202 - val_loss: 1.9387 - val_accuracy: 0.3720 Epoch 34/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8519 - accuracy: 0.4322 - val_loss: 1.9375 - val_accuracy: 0.3963 Epoch 35/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8411 - accuracy: 0.4332 - val_loss: 1.9361 - val_accuracy: 0.3873 Epoch 36/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8367 - accuracy: 0.4384 - val_loss: 1.9093 - val_accuracy: 0.3997 Epoch 37/150 283/283 [==============================] - 5s 18ms/step - loss: 1.8243 - accuracy: 0.4405 - val_loss: 1.9763 - val_accuracy: 0.3630 Epoch 38/150 283/283 [==============================] - 5s 17ms/step - loss: 1.8135 - accuracy: 0.4463 - val_loss: 1.9547 - val_accuracy: 0.3853 Epoch 39/150 283/283 [==============================] - 5s 18ms/step - loss: 1.8038 - accuracy: 0.4486 - val_loss: 1.9395 - val_accuracy: 0.3920 Epoch 40/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7931 - accuracy: 0.4540 - val_loss: 1.8492 - val_accuracy: 0.4483 Epoch 41/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7885 - accuracy: 0.4520 - val_loss: 1.8518 - val_accuracy: 0.4503 Epoch 42/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7734 - accuracy: 0.4611 - val_loss: 1.8888 - val_accuracy: 0.4187 Epoch 43/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7679 - accuracy: 0.4658 - val_loss: 1.8294 - val_accuracy: 0.4370 Epoch 44/150 283/283 [==============================] - 5s 17ms/step - loss: 1.7605 - accuracy: 0.4656 - val_loss: 1.8837 - val_accuracy: 0.4140 Epoch 45/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7411 - accuracy: 0.4751 - val_loss: 1.8037 - val_accuracy: 0.4680 Epoch 46/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7360 - accuracy: 0.4771 - val_loss: 1.8091 - val_accuracy: 0.4377 Epoch 47/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7312 - accuracy: 0.4765 - val_loss: 1.7957 - val_accuracy: 0.4687 Epoch 48/150 283/283 [==============================] - 6s 18ms/step - loss: 1.7172 - accuracy: 0.4833 - val_loss: 1.7650 - val_accuracy: 0.4840 Epoch 49/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7136 - accuracy: 0.4858 - val_loss: 1.8339 - val_accuracy: 0.4307 Epoch 50/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7037 - accuracy: 0.4863 - val_loss: 1.7736 - val_accuracy: 0.4537 Epoch 51/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6956 - accuracy: 0.4920 - val_loss: 1.8181 - val_accuracy: 0.4503 Epoch 52/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6832 - accuracy: 0.4916 - val_loss: 1.8062 - val_accuracy: 0.4573 Epoch 53/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6740 - accuracy: 0.5009 - val_loss: 1.7758 - val_accuracy: 0.4727 Epoch 54/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6680 - accuracy: 0.4984 - val_loss: 1.7388 - val_accuracy: 0.4780 Epoch 55/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6595 - accuracy: 0.4998 - val_loss: 1.7511 - val_accuracy: 0.4533 Epoch 56/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6494 - accuracy: 0.5045 - val_loss: 1.7124 - val_accuracy: 0.4857 Epoch 57/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6423 - accuracy: 0.5089 - val_loss: 1.7220 - val_accuracy: 0.4927 Epoch 58/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6353 - accuracy: 0.5104 - val_loss: 1.6820 - val_accuracy: 0.4983 Epoch 59/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6273 - accuracy: 0.5134 - val_loss: 1.6962 - val_accuracy: 0.4887 Epoch 60/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6208 - accuracy: 0.5160 - val_loss: 1.7033 - val_accuracy: 0.4930 Epoch 61/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6095 - accuracy: 0.5148 - val_loss: 1.6676 - val_accuracy: 0.5087 Epoch 62/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6030 - accuracy: 0.5294 - val_loss: 1.7004 - val_accuracy: 0.4897 Epoch 63/150 283/283 [==============================] - 5s 17ms/step - loss: 1.6032 - accuracy: 0.5257 - val_loss: 1.6837 - val_accuracy: 0.4997 Epoch 64/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5950 - accuracy: 0.5286 - val_loss: 1.7179 - val_accuracy: 0.4693 Epoch 65/150 283/283 [==============================] - 5s 18ms/step - loss: 1.5905 - accuracy: 0.5243 - val_loss: 1.6618 - val_accuracy: 0.5150 Epoch 66/150 283/283 [==============================] - 5s 18ms/step - loss: 1.5827 - accuracy: 0.5270 - val_loss: 1.6794 - val_accuracy: 0.5080 Epoch 67/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5781 - accuracy: 0.5274 - val_loss: 1.6159 - val_accuracy: 0.5247 Epoch 68/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5667 - accuracy: 0.5309 - val_loss: 1.6944 - val_accuracy: 0.4843 Epoch 69/150 283/283 [==============================] - 6s 21ms/step - loss: 1.5549 - accuracy: 0.5370 - val_loss: 1.6174 - val_accuracy: 0.5193 Epoch 70/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5581 - accuracy: 0.5359 - val_loss: 1.6127 - val_accuracy: 0.5260 Epoch 71/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5469 - accuracy: 0.5422 - val_loss: 1.6820 - val_accuracy: 0.4877 Epoch 72/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5436 - accuracy: 0.5445 - val_loss: 1.6410 - val_accuracy: 0.5137 Epoch 73/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5414 - accuracy: 0.5407 - val_loss: 1.6027 - val_accuracy: 0.5353 Epoch 74/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5367 - accuracy: 0.5434 - val_loss: 1.6067 - val_accuracy: 0.5283 Epoch 75/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5281 - accuracy: 0.5423 - val_loss: 1.6204 - val_accuracy: 0.5190 Epoch 76/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5235 - accuracy: 0.5449 - val_loss: 1.7209 - val_accuracy: 0.4730 Epoch 77/150 283/283 [==============================] - 5s 16ms/step - loss: 1.5173 - accuracy: 0.5477 - val_loss: 1.6750 - val_accuracy: 0.4800 Epoch 78/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5107 - accuracy: 0.5471 - val_loss: 1.5528 - val_accuracy: 0.5357 Epoch 79/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5113 - accuracy: 0.5487 - val_loss: 1.5890 - val_accuracy: 0.5280 Epoch 80/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4994 - accuracy: 0.5496 - val_loss: 1.5359 - val_accuracy: 0.5483 Epoch 81/150 283/283 [==============================] - 5s 17ms/step - loss: 1.5042 - accuracy: 0.5512 - val_loss: 1.5724 - val_accuracy: 0.5277 Epoch 82/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4933 - accuracy: 0.5527 - val_loss: 1.6636 - val_accuracy: 0.4853 Epoch 83/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4902 - accuracy: 0.5562 - val_loss: 1.5724 - val_accuracy: 0.5290 Epoch 84/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4823 - accuracy: 0.5570 - val_loss: 1.5807 - val_accuracy: 0.5267 Epoch 85/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4789 - accuracy: 0.5577 - val_loss: 1.6333 - val_accuracy: 0.5057 Epoch 86/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4813 - accuracy: 0.5567 - val_loss: 1.6051 - val_accuracy: 0.5147 Epoch 87/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4701 - accuracy: 0.5616 - val_loss: 1.5663 - val_accuracy: 0.5360 Epoch 88/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4765 - accuracy: 0.5555 - val_loss: 1.5226 - val_accuracy: 0.5450 Epoch 89/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4666 - accuracy: 0.5573 - val_loss: 1.5681 - val_accuracy: 0.5257 Epoch 90/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4581 - accuracy: 0.5604 - val_loss: 1.5216 - val_accuracy: 0.5390 Epoch 91/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4554 - accuracy: 0.5606 - val_loss: 1.5369 - val_accuracy: 0.5327 Epoch 92/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4616 - accuracy: 0.5601 - val_loss: 1.5054 - val_accuracy: 0.5630 Epoch 93/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4515 - accuracy: 0.5654 - val_loss: 1.5471 - val_accuracy: 0.5260 Epoch 94/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4472 - accuracy: 0.5666 - val_loss: 1.5122 - val_accuracy: 0.5550 Epoch 95/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4445 - accuracy: 0.5632 - val_loss: 1.6064 - val_accuracy: 0.5150 Epoch 96/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4446 - accuracy: 0.5672 - val_loss: 1.5373 - val_accuracy: 0.5357 Epoch 97/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4387 - accuracy: 0.5638 - val_loss: 1.5082 - val_accuracy: 0.5483 Epoch 98/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4301 - accuracy: 0.5724 - val_loss: 1.5081 - val_accuracy: 0.5513 Epoch 99/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4323 - accuracy: 0.5654 - val_loss: 1.5292 - val_accuracy: 0.5363 Epoch 100/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4253 - accuracy: 0.5680 - val_loss: 1.5145 - val_accuracy: 0.5387 Epoch 101/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4227 - accuracy: 0.5740 - val_loss: 1.4716 - val_accuracy: 0.5467 Epoch 102/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4163 - accuracy: 0.5722 - val_loss: 1.4802 - val_accuracy: 0.5573 Epoch 103/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4150 - accuracy: 0.5701 - val_loss: 1.4681 - val_accuracy: 0.5537 Epoch 104/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4183 - accuracy: 0.5760 - val_loss: 1.4959 - val_accuracy: 0.5310 Epoch 105/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4163 - accuracy: 0.5743 - val_loss: 1.5400 - val_accuracy: 0.5423 Epoch 106/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4133 - accuracy: 0.5735 - val_loss: 1.4785 - val_accuracy: 0.5500 Epoch 107/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4026 - accuracy: 0.5740 - val_loss: 1.4429 - val_accuracy: 0.5727 Epoch 108/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4078 - accuracy: 0.5709 - val_loss: 1.4698 - val_accuracy: 0.5563 Epoch 109/150 283/283 [==============================] - 5s 16ms/step - loss: 1.4071 - accuracy: 0.5737 - val_loss: 1.4532 - val_accuracy: 0.5620 Epoch 110/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3934 - accuracy: 0.5805 - val_loss: 1.4662 - val_accuracy: 0.5503 Epoch 111/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3898 - accuracy: 0.5842 - val_loss: 1.4966 - val_accuracy: 0.5387 Epoch 112/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3862 - accuracy: 0.5807 - val_loss: 1.4551 - val_accuracy: 0.5667 Epoch 113/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3894 - accuracy: 0.5810 - val_loss: 1.4884 - val_accuracy: 0.5450 Epoch 114/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3911 - accuracy: 0.5780 - val_loss: 1.4122 - val_accuracy: 0.5780 Epoch 115/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3874 - accuracy: 0.5778 - val_loss: 1.4835 - val_accuracy: 0.5647 Epoch 116/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3852 - accuracy: 0.5780 - val_loss: 1.5265 - val_accuracy: 0.5337 Epoch 117/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3811 - accuracy: 0.5823 - val_loss: 1.4247 - val_accuracy: 0.5653 Epoch 118/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3750 - accuracy: 0.5873 - val_loss: 1.4476 - val_accuracy: 0.5693 Epoch 119/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3636 - accuracy: 0.5848 - val_loss: 1.4334 - val_accuracy: 0.5603 Epoch 120/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3708 - accuracy: 0.5862 - val_loss: 1.4215 - val_accuracy: 0.5763 Epoch 121/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3659 - accuracy: 0.5850 - val_loss: 1.4714 - val_accuracy: 0.5637 Epoch 122/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3658 - accuracy: 0.5848 - val_loss: 1.3973 - val_accuracy: 0.5843 Epoch 123/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3586 - accuracy: 0.5865 - val_loss: 1.5174 - val_accuracy: 0.5393 Epoch 124/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3564 - accuracy: 0.5876 - val_loss: 1.5561 - val_accuracy: 0.5093 Epoch 125/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3507 - accuracy: 0.5908 - val_loss: 1.4854 - val_accuracy: 0.5457 Epoch 126/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3561 - accuracy: 0.5846 - val_loss: 1.4535 - val_accuracy: 0.5457 Epoch 127/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3501 - accuracy: 0.5905 - val_loss: 1.4564 - val_accuracy: 0.5510 Epoch 128/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3544 - accuracy: 0.5904 - val_loss: 1.4507 - val_accuracy: 0.5640 Epoch 129/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3478 - accuracy: 0.5955 - val_loss: 1.4210 - val_accuracy: 0.5647 Epoch 130/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3449 - accuracy: 0.5919 - val_loss: 1.4228 - val_accuracy: 0.5617 Epoch 131/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3384 - accuracy: 0.5920 - val_loss: 1.3924 - val_accuracy: 0.5697 Epoch 132/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3481 - accuracy: 0.5897 - val_loss: 1.3883 - val_accuracy: 0.5863 Epoch 133/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3401 - accuracy: 0.5930 - val_loss: 1.4992 - val_accuracy: 0.5330 Epoch 134/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3326 - accuracy: 0.5935 - val_loss: 1.4985 - val_accuracy: 0.5307 Epoch 135/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3294 - accuracy: 0.5925 - val_loss: 1.4420 - val_accuracy: 0.5633 Epoch 136/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3315 - accuracy: 0.5883 - val_loss: 1.3997 - val_accuracy: 0.5750 Epoch 137/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3298 - accuracy: 0.5980 - val_loss: 1.4304 - val_accuracy: 0.5723 Epoch 138/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3297 - accuracy: 0.5943 - val_loss: 1.3814 - val_accuracy: 0.5883 Epoch 139/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3275 - accuracy: 0.5967 - val_loss: 1.3632 - val_accuracy: 0.5757 Epoch 140/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3269 - accuracy: 0.5978 - val_loss: 1.3908 - val_accuracy: 0.5740 Epoch 141/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3182 - accuracy: 0.5981 - val_loss: 1.3574 - val_accuracy: 0.5903 Epoch 142/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3153 - accuracy: 0.5985 - val_loss: 1.4173 - val_accuracy: 0.5690 Epoch 143/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3109 - accuracy: 0.5978 - val_loss: 1.3619 - val_accuracy: 0.5797 Epoch 144/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3101 - accuracy: 0.5998 - val_loss: 1.4027 - val_accuracy: 0.5720 Epoch 145/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3091 - accuracy: 0.5998 - val_loss: 1.3337 - val_accuracy: 0.5943 Epoch 146/150 283/283 [==============================] - 5s 16ms/step - loss: 1.3086 - accuracy: 0.5986 - val_loss: 1.3357 - val_accuracy: 0.5920 Epoch 147/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3051 - accuracy: 0.6018 - val_loss: 1.4908 - val_accuracy: 0.5477 Epoch 148/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3036 - accuracy: 0.6009 - val_loss: 1.3150 - val_accuracy: 0.6073 Epoch 149/150 283/283 [==============================] - 5s 16ms/step - loss: 1.2990 - accuracy: 0.6079 - val_loss: 1.4468 - val_accuracy: 0.5387 Epoch 150/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3008 - accuracy: 0.6058 - val_loss: 1.3631 - val_accuracy: 0.5850
baseline128greyscale_model = clone_model(baseline_model)
baseline128greyscale_model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = baseline128luminosity_model.fit(res128luminosity_train_data, validation_data = res128luminosity_validation_data, epochs = 150, callbacks = earlyStopper)
cnnf.class_image(res128luminosity_full_data, class_names, cmap = "gray")
cnnf.model_metric_graph(history, baseline128luminosity_model, res128luminosity_validation_data, name = "baseline model for 128x128 image (using luminosity method)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 5s 17ms/step - loss: 2.6276 - accuracy: 0.1135 - val_loss: 2.7736 - val_accuracy: 0.1150 Epoch 2/150 283/283 [==============================] - 5s 17ms/step - loss: 2.5313 - accuracy: 0.1917 - val_loss: 2.6425 - val_accuracy: 0.1773 Epoch 3/150 283/283 [==============================] - 5s 17ms/step - loss: 2.4202 - accuracy: 0.2336 - val_loss: 2.5386 - val_accuracy: 0.1793 Epoch 4/150 283/283 [==============================] - 5s 17ms/step - loss: 2.3292 - accuracy: 0.2621 - val_loss: 2.4175 - val_accuracy: 0.2170 Epoch 5/150 283/283 [==============================] - 5s 18ms/step - loss: 2.2522 - accuracy: 0.2785 - val_loss: 2.3220 - val_accuracy: 0.2300 Epoch 6/150 283/283 [==============================] - 5s 17ms/step - loss: 2.2048 - accuracy: 0.2894 - val_loss: 2.2824 - val_accuracy: 0.2573 Epoch 7/150 283/283 [==============================] - 5s 18ms/step - loss: 2.1712 - accuracy: 0.3044 - val_loss: 2.3325 - val_accuracy: 0.2110 Epoch 8/150 283/283 [==============================] - 5s 18ms/step - loss: 2.1456 - accuracy: 0.3137 - val_loss: 2.2341 - val_accuracy: 0.2587 Epoch 9/150 283/283 [==============================] - 5s 17ms/step - loss: 2.1242 - accuracy: 0.3206 - val_loss: 2.2262 - val_accuracy: 0.2677 Epoch 10/150 283/283 [==============================] - 5s 18ms/step - loss: 2.1110 - accuracy: 0.3270 - val_loss: 2.1865 - val_accuracy: 0.2830 Epoch 11/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0892 - accuracy: 0.3311 - val_loss: 2.1513 - val_accuracy: 0.3097 Epoch 12/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0762 - accuracy: 0.3386 - val_loss: 2.1308 - val_accuracy: 0.3183 Epoch 13/150 283/283 [==============================] - 5s 17ms/step - loss: 2.0585 - accuracy: 0.3487 - val_loss: 2.1853 - val_accuracy: 0.2903 Epoch 14/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0493 - accuracy: 0.3539 - val_loss: 2.1824 - val_accuracy: 0.2803 Epoch 15/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0341 - accuracy: 0.3576 - val_loss: 2.1780 - val_accuracy: 0.2913 Epoch 16/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0240 - accuracy: 0.3614 - val_loss: 2.1852 - val_accuracy: 0.2783 Epoch 17/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0026 - accuracy: 0.3706 - val_loss: 2.0520 - val_accuracy: 0.3560 Epoch 18/150 283/283 [==============================] - 5s 17ms/step - loss: 1.9892 - accuracy: 0.3741 - val_loss: 2.1072 - val_accuracy: 0.3090 Epoch 19/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9737 - accuracy: 0.3789 - val_loss: 2.0387 - val_accuracy: 0.3503 Epoch 20/150 283/283 [==============================] - 6s 18ms/step - loss: 1.9636 - accuracy: 0.3816 - val_loss: 2.0070 - val_accuracy: 0.3780 Epoch 21/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9433 - accuracy: 0.3914 - val_loss: 1.9794 - val_accuracy: 0.3897 Epoch 22/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9280 - accuracy: 0.3939 - val_loss: 2.0382 - val_accuracy: 0.3553 Epoch 23/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9141 - accuracy: 0.4041 - val_loss: 1.9950 - val_accuracy: 0.3673 Epoch 24/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8903 - accuracy: 0.4107 - val_loss: 1.9797 - val_accuracy: 0.3850 Epoch 25/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8816 - accuracy: 0.4071 - val_loss: 2.0751 - val_accuracy: 0.3353 Epoch 26/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8627 - accuracy: 0.4177 - val_loss: 1.9136 - val_accuracy: 0.4127 Epoch 27/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8436 - accuracy: 0.4289 - val_loss: 1.9300 - val_accuracy: 0.3997 Epoch 28/150 283/283 [==============================] - 5s 18ms/step - loss: 1.8327 - accuracy: 0.4387 - val_loss: 1.9118 - val_accuracy: 0.4077 Epoch 29/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8131 - accuracy: 0.4426 - val_loss: 1.8724 - val_accuracy: 0.4420 Epoch 30/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7999 - accuracy: 0.4500 - val_loss: 1.8532 - val_accuracy: 0.4473 Epoch 31/150 283/283 [==============================] - 6s 18ms/step - loss: 1.7837 - accuracy: 0.4536 - val_loss: 1.8855 - val_accuracy: 0.4160 Epoch 32/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7685 - accuracy: 0.4584 - val_loss: 1.8719 - val_accuracy: 0.4043 Epoch 33/150 283/283 [==============================] - 5s 17ms/step - loss: 1.7571 - accuracy: 0.4652 - val_loss: 1.9498 - val_accuracy: 0.3860 Epoch 34/150 283/283 [==============================] - 5s 18ms/step - loss: 1.7378 - accuracy: 0.4708 - val_loss: 1.8155 - val_accuracy: 0.4523 Epoch 35/150 283/283 [==============================] - 6s 21ms/step - loss: 1.7361 - accuracy: 0.4677 - val_loss: 1.7904 - val_accuracy: 0.4587 Epoch 36/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7173 - accuracy: 0.4788 - val_loss: 1.7946 - val_accuracy: 0.4577 Epoch 37/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7122 - accuracy: 0.4833 - val_loss: 1.8564 - val_accuracy: 0.4253 Epoch 38/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7061 - accuracy: 0.4842 - val_loss: 1.7850 - val_accuracy: 0.4727 Epoch 39/150 283/283 [==============================] - 6s 18ms/step - loss: 1.6926 - accuracy: 0.4909 - val_loss: 1.7868 - val_accuracy: 0.4580 Epoch 40/150 283/283 [==============================] - 5s 18ms/step - loss: 1.6857 - accuracy: 0.4942 - val_loss: 1.7460 - val_accuracy: 0.4823 Epoch 41/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6729 - accuracy: 0.4940 - val_loss: 1.7757 - val_accuracy: 0.4693 Epoch 42/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6597 - accuracy: 0.5019 - val_loss: 1.7433 - val_accuracy: 0.4717 Epoch 43/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6536 - accuracy: 0.5020 - val_loss: 1.7035 - val_accuracy: 0.4857 Epoch 44/150 283/283 [==============================] - 6s 18ms/step - loss: 1.6484 - accuracy: 0.5039 - val_loss: 1.7168 - val_accuracy: 0.5000 Epoch 45/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6363 - accuracy: 0.5113 - val_loss: 1.7126 - val_accuracy: 0.4830 Epoch 46/150 283/283 [==============================] - 5s 18ms/step - loss: 1.6333 - accuracy: 0.5141 - val_loss: 1.6959 - val_accuracy: 0.4870 Epoch 47/150 283/283 [==============================] - 5s 18ms/step - loss: 1.6327 - accuracy: 0.5091 - val_loss: 1.7462 - val_accuracy: 0.4737 Epoch 48/150 283/283 [==============================] - 5s 18ms/step - loss: 1.6158 - accuracy: 0.5168 - val_loss: 1.7156 - val_accuracy: 0.4853 Epoch 49/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6082 - accuracy: 0.5172 - val_loss: 1.6732 - val_accuracy: 0.5067 Epoch 50/150 283/283 [==============================] - 6s 18ms/step - loss: 1.6031 - accuracy: 0.5230 - val_loss: 1.7524 - val_accuracy: 0.4600 Epoch 51/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5923 - accuracy: 0.5246 - val_loss: 1.7314 - val_accuracy: 0.4790 Epoch 52/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5878 - accuracy: 0.5281 - val_loss: 1.6394 - val_accuracy: 0.4953 Epoch 53/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5822 - accuracy: 0.5302 - val_loss: 1.6910 - val_accuracy: 0.5067 Epoch 54/150 283/283 [==============================] - 5s 18ms/step - loss: 1.5841 - accuracy: 0.5219 - val_loss: 1.6668 - val_accuracy: 0.4977 Epoch 55/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5745 - accuracy: 0.5325 - val_loss: 1.6421 - val_accuracy: 0.5227 Epoch 56/150 283/283 [==============================] - 5s 18ms/step - loss: 1.5603 - accuracy: 0.5353 - val_loss: 1.6487 - val_accuracy: 0.5067 Epoch 57/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5534 - accuracy: 0.5382 - val_loss: 1.6352 - val_accuracy: 0.5130 Epoch 58/150 283/283 [==============================] - 5s 18ms/step - loss: 1.5609 - accuracy: 0.5302 - val_loss: 1.6134 - val_accuracy: 0.5197 Epoch 59/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5542 - accuracy: 0.5392 - val_loss: 1.6186 - val_accuracy: 0.5217 Epoch 60/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5400 - accuracy: 0.5444 - val_loss: 1.6576 - val_accuracy: 0.5033 Epoch 61/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5406 - accuracy: 0.5440 - val_loss: 1.6908 - val_accuracy: 0.4820 Epoch 62/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5430 - accuracy: 0.5347 - val_loss: 1.6106 - val_accuracy: 0.5143 Epoch 63/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5331 - accuracy: 0.5425 - val_loss: 1.6243 - val_accuracy: 0.5223 Epoch 64/150 283/283 [==============================] - 5s 18ms/step - loss: 1.5157 - accuracy: 0.5533 - val_loss: 1.5903 - val_accuracy: 0.5253 Epoch 65/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5131 - accuracy: 0.5482 - val_loss: 1.5728 - val_accuracy: 0.5400 Epoch 66/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5220 - accuracy: 0.5444 - val_loss: 1.6151 - val_accuracy: 0.5160 Epoch 67/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5066 - accuracy: 0.5519 - val_loss: 1.5992 - val_accuracy: 0.5190 Epoch 68/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5024 - accuracy: 0.5536 - val_loss: 1.6315 - val_accuracy: 0.4993 Epoch 69/150 283/283 [==============================] - 5s 18ms/step - loss: 1.4999 - accuracy: 0.5538 - val_loss: 1.5635 - val_accuracy: 0.5383 Epoch 70/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4925 - accuracy: 0.5527 - val_loss: 1.5275 - val_accuracy: 0.5543 Epoch 71/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4889 - accuracy: 0.5543 - val_loss: 1.6014 - val_accuracy: 0.5293 Epoch 72/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4865 - accuracy: 0.5572 - val_loss: 1.5454 - val_accuracy: 0.5430 Epoch 73/150 283/283 [==============================] - 5s 18ms/step - loss: 1.4771 - accuracy: 0.5574 - val_loss: 1.5375 - val_accuracy: 0.5370 Epoch 74/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4859 - accuracy: 0.5531 - val_loss: 1.5618 - val_accuracy: 0.5170 Epoch 75/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4725 - accuracy: 0.5639 - val_loss: 1.6893 - val_accuracy: 0.4910 Epoch 76/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4619 - accuracy: 0.5645 - val_loss: 1.5204 - val_accuracy: 0.5510 Epoch 77/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4580 - accuracy: 0.5613 - val_loss: 1.5291 - val_accuracy: 0.5523 Epoch 78/150 283/283 [==============================] - 5s 18ms/step - loss: 1.4620 - accuracy: 0.5647 - val_loss: 1.6186 - val_accuracy: 0.4887 Epoch 79/150 283/283 [==============================] - 5s 18ms/step - loss: 1.4569 - accuracy: 0.5656 - val_loss: 1.4905 - val_accuracy: 0.5590 Epoch 80/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4514 - accuracy: 0.5645 - val_loss: 1.4985 - val_accuracy: 0.5617 Epoch 81/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4461 - accuracy: 0.5667 - val_loss: 1.5102 - val_accuracy: 0.5467 Epoch 82/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4400 - accuracy: 0.5665 - val_loss: 1.5524 - val_accuracy: 0.5193 Epoch 83/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4379 - accuracy: 0.5683 - val_loss: 1.4835 - val_accuracy: 0.5540 Epoch 84/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4385 - accuracy: 0.5693 - val_loss: 1.5261 - val_accuracy: 0.5273 Epoch 85/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4335 - accuracy: 0.5707 - val_loss: 1.5392 - val_accuracy: 0.5443 Epoch 86/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4275 - accuracy: 0.5726 - val_loss: 1.5308 - val_accuracy: 0.5333 Epoch 87/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4221 - accuracy: 0.5748 - val_loss: 1.4860 - val_accuracy: 0.5483 Epoch 88/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4182 - accuracy: 0.5757 - val_loss: 1.4827 - val_accuracy: 0.5597 Epoch 89/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4147 - accuracy: 0.5755 - val_loss: 1.5378 - val_accuracy: 0.5220 Epoch 90/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4070 - accuracy: 0.5821 - val_loss: 1.6334 - val_accuracy: 0.4970 Epoch 91/150 283/283 [==============================] - 5s 18ms/step - loss: 1.4165 - accuracy: 0.5710 - val_loss: 1.4493 - val_accuracy: 0.5670 Epoch 92/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4035 - accuracy: 0.5785 - val_loss: 1.4521 - val_accuracy: 0.5723 Epoch 93/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3995 - accuracy: 0.5845 - val_loss: 1.4441 - val_accuracy: 0.5660 Epoch 94/150 283/283 [==============================] - 5s 17ms/step - loss: 1.4114 - accuracy: 0.5741 - val_loss: 1.4682 - val_accuracy: 0.5610 Epoch 95/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3956 - accuracy: 0.5786 - val_loss: 1.5790 - val_accuracy: 0.5043 Epoch 96/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3920 - accuracy: 0.5803 - val_loss: 1.4451 - val_accuracy: 0.5633 Epoch 97/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3901 - accuracy: 0.5853 - val_loss: 1.4570 - val_accuracy: 0.5497 Epoch 98/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3798 - accuracy: 0.5860 - val_loss: 1.4403 - val_accuracy: 0.5870 Epoch 99/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3862 - accuracy: 0.5799 - val_loss: 1.4347 - val_accuracy: 0.5777 Epoch 100/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3854 - accuracy: 0.5836 - val_loss: 1.4068 - val_accuracy: 0.5890 Epoch 101/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3752 - accuracy: 0.5873 - val_loss: 1.4457 - val_accuracy: 0.5690 Epoch 102/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3673 - accuracy: 0.5862 - val_loss: 1.4547 - val_accuracy: 0.5623 Epoch 103/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3739 - accuracy: 0.5872 - val_loss: 1.4405 - val_accuracy: 0.5880 Epoch 104/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3679 - accuracy: 0.5861 - val_loss: 1.4122 - val_accuracy: 0.5747 Epoch 105/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3569 - accuracy: 0.5905 - val_loss: 1.4242 - val_accuracy: 0.5540 Epoch 106/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3539 - accuracy: 0.5919 - val_loss: 1.4628 - val_accuracy: 0.5467 Epoch 107/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3706 - accuracy: 0.5820 - val_loss: 1.4621 - val_accuracy: 0.5620 Epoch 108/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3555 - accuracy: 0.5923 - val_loss: 1.4208 - val_accuracy: 0.5620 Epoch 109/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3616 - accuracy: 0.5914 - val_loss: 1.4099 - val_accuracy: 0.5803 Epoch 110/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3580 - accuracy: 0.5938 - val_loss: 1.4482 - val_accuracy: 0.5520 Epoch 111/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3493 - accuracy: 0.5948 - val_loss: 1.4680 - val_accuracy: 0.5600 Epoch 112/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3473 - accuracy: 0.5965 - val_loss: 1.4038 - val_accuracy: 0.5787 Epoch 113/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3412 - accuracy: 0.5946 - val_loss: 1.4693 - val_accuracy: 0.5633 Epoch 114/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3398 - accuracy: 0.5916 - val_loss: 1.4027 - val_accuracy: 0.5967 Epoch 115/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3357 - accuracy: 0.5961 - val_loss: 1.3977 - val_accuracy: 0.5907 Epoch 116/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3354 - accuracy: 0.5980 - val_loss: 1.3975 - val_accuracy: 0.5740 Epoch 117/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3235 - accuracy: 0.5997 - val_loss: 1.3747 - val_accuracy: 0.5790 Epoch 118/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3284 - accuracy: 0.5957 - val_loss: 1.3873 - val_accuracy: 0.5787 Epoch 119/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3274 - accuracy: 0.6014 - val_loss: 1.4009 - val_accuracy: 0.5833 Epoch 120/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3214 - accuracy: 0.6015 - val_loss: 1.3758 - val_accuracy: 0.6000 Epoch 121/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3139 - accuracy: 0.6076 - val_loss: 1.4082 - val_accuracy: 0.5683 Epoch 122/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3171 - accuracy: 0.6045 - val_loss: 1.3891 - val_accuracy: 0.5763 Epoch 123/150 283/283 [==============================] - 6s 18ms/step - loss: 1.3110 - accuracy: 0.6039 - val_loss: 1.3606 - val_accuracy: 0.5923 Epoch 124/150 283/283 [==============================] - 5s 17ms/step - loss: 1.3130 - accuracy: 0.6043 - val_loss: 1.3926 - val_accuracy: 0.5873 Epoch 125/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3106 - accuracy: 0.6011 - val_loss: 1.3309 - val_accuracy: 0.6030 Epoch 126/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3060 - accuracy: 0.6059 - val_loss: 1.3774 - val_accuracy: 0.5690 Epoch 127/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3048 - accuracy: 0.6062 - val_loss: 1.3760 - val_accuracy: 0.5743 Epoch 128/150 283/283 [==============================] - 5s 18ms/step - loss: 1.3085 - accuracy: 0.6027 - val_loss: 1.3417 - val_accuracy: 0.6153 Epoch 129/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2916 - accuracy: 0.6099 - val_loss: 1.3631 - val_accuracy: 0.5973 Epoch 130/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2974 - accuracy: 0.6072 - val_loss: 1.3460 - val_accuracy: 0.5993 Epoch 131/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2929 - accuracy: 0.6105 - val_loss: 1.3467 - val_accuracy: 0.5903 Epoch 132/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2818 - accuracy: 0.6136 - val_loss: 1.3232 - val_accuracy: 0.6053 Epoch 133/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2931 - accuracy: 0.6100 - val_loss: 1.3426 - val_accuracy: 0.6007 Epoch 134/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2829 - accuracy: 0.6092 - val_loss: 1.3649 - val_accuracy: 0.5767 Epoch 135/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2822 - accuracy: 0.6123 - val_loss: 1.3208 - val_accuracy: 0.5997 Epoch 136/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2799 - accuracy: 0.6101 - val_loss: 1.3000 - val_accuracy: 0.6197 Epoch 137/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2728 - accuracy: 0.6166 - val_loss: 1.3418 - val_accuracy: 0.5973 Epoch 138/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2818 - accuracy: 0.6187 - val_loss: 1.3629 - val_accuracy: 0.5737 Epoch 139/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2746 - accuracy: 0.6141 - val_loss: 1.3287 - val_accuracy: 0.5877 Epoch 140/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2800 - accuracy: 0.6102 - val_loss: 1.3584 - val_accuracy: 0.5870 Epoch 141/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2734 - accuracy: 0.6132 - val_loss: 1.3030 - val_accuracy: 0.6083 Epoch 142/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2634 - accuracy: 0.6184 - val_loss: 1.3377 - val_accuracy: 0.6133 Epoch 143/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2655 - accuracy: 0.6123 - val_loss: 1.3184 - val_accuracy: 0.5980 Epoch 144/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2610 - accuracy: 0.6197 - val_loss: 1.3210 - val_accuracy: 0.6177 Epoch 145/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2585 - accuracy: 0.6162 - val_loss: 1.2888 - val_accuracy: 0.6193 Epoch 146/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2584 - accuracy: 0.6224 - val_loss: 1.3954 - val_accuracy: 0.5757 Epoch 147/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2620 - accuracy: 0.6186 - val_loss: 1.2872 - val_accuracy: 0.5977 Epoch 148/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2517 - accuracy: 0.6203 - val_loss: 1.3053 - val_accuracy: 0.5923 Epoch 149/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2509 - accuracy: 0.6197 - val_loss: 1.2885 - val_accuracy: 0.6140 Epoch 150/150 283/283 [==============================] - 5s 17ms/step - loss: 1.2486 - accuracy: 0.6249 - val_loss: 1.3337 - val_accuracy: 0.6287
history = baseline128desaturation_model.fit(res128desaturation_train_data, validation_data = res128desaturation_validation_data, epochs = 150, callbacks = earlyStopper)
cnnf.class_image(res128desaturation_full_data, class_names, cmap = "gray")
cnnf.model_metric_graph(history, baseline128desaturation_model, res128desaturation_validation_data, name = "baseline model for 128x128 image (using desaturation method)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 7s 20ms/step - loss: 2.6102 - accuracy: 0.1426 - val_loss: 2.7569 - val_accuracy: 0.1147 Epoch 2/150 283/283 [==============================] - 6s 20ms/step - loss: 2.4965 - accuracy: 0.1989 - val_loss: 2.6472 - val_accuracy: 0.1853 Epoch 3/150 283/283 [==============================] - 6s 20ms/step - loss: 2.4159 - accuracy: 0.2409 - val_loss: 2.5997 - val_accuracy: 0.1730 Epoch 4/150 283/283 [==============================] - 6s 20ms/step - loss: 2.3631 - accuracy: 0.2584 - val_loss: 2.5061 - val_accuracy: 0.2063 Epoch 5/150 283/283 [==============================] - 6s 20ms/step - loss: 2.3123 - accuracy: 0.2720 - val_loss: 2.4435 - val_accuracy: 0.2197 Epoch 6/150 283/283 [==============================] - 6s 19ms/step - loss: 2.2634 - accuracy: 0.2812 - val_loss: 2.3851 - val_accuracy: 0.2160 Epoch 7/150 283/283 [==============================] - 6s 18ms/step - loss: 2.2193 - accuracy: 0.3036 - val_loss: 2.3640 - val_accuracy: 0.2280 Epoch 8/150 283/283 [==============================] - 6s 18ms/step - loss: 2.1774 - accuracy: 0.3286 - val_loss: 2.2765 - val_accuracy: 0.2630 Epoch 9/150 283/283 [==============================] - 6s 19ms/step - loss: 2.1427 - accuracy: 0.3385 - val_loss: 2.2628 - val_accuracy: 0.2840 Epoch 10/150 283/283 [==============================] - 5s 18ms/step - loss: 2.1080 - accuracy: 0.3470 - val_loss: 2.2673 - val_accuracy: 0.2740 Epoch 11/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0877 - accuracy: 0.3573 - val_loss: 2.2353 - val_accuracy: 0.2953 Epoch 12/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0659 - accuracy: 0.3556 - val_loss: 2.1905 - val_accuracy: 0.2950 Epoch 13/150 283/283 [==============================] - 5s 18ms/step - loss: 2.0458 - accuracy: 0.3661 - val_loss: 2.1435 - val_accuracy: 0.3097 Epoch 14/150 283/283 [==============================] - 6s 20ms/step - loss: 2.0367 - accuracy: 0.3694 - val_loss: 2.1633 - val_accuracy: 0.3063 Epoch 15/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0195 - accuracy: 0.3766 - val_loss: 2.1196 - val_accuracy: 0.3070 Epoch 16/150 283/283 [==============================] - 6s 19ms/step - loss: 2.0152 - accuracy: 0.3747 - val_loss: 2.0998 - val_accuracy: 0.3157 Epoch 17/150 283/283 [==============================] - 6s 19ms/step - loss: 1.9993 - accuracy: 0.3732 - val_loss: 2.0893 - val_accuracy: 0.3337 Epoch 18/150 283/283 [==============================] - 6s 19ms/step - loss: 1.9909 - accuracy: 0.3820 - val_loss: 2.1594 - val_accuracy: 0.3087 Epoch 19/150 283/283 [==============================] - 6s 19ms/step - loss: 1.9802 - accuracy: 0.3842 - val_loss: 2.0845 - val_accuracy: 0.3257 Epoch 20/150 283/283 [==============================] - 6s 18ms/step - loss: 1.9749 - accuracy: 0.3881 - val_loss: 2.0641 - val_accuracy: 0.3287 Epoch 21/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9631 - accuracy: 0.3922 - val_loss: 2.0569 - val_accuracy: 0.3257 Epoch 22/150 283/283 [==============================] - 6s 18ms/step - loss: 1.9545 - accuracy: 0.3889 - val_loss: 2.0403 - val_accuracy: 0.3310 Epoch 23/150 283/283 [==============================] - 6s 18ms/step - loss: 1.9425 - accuracy: 0.3949 - val_loss: 2.0359 - val_accuracy: 0.3403 Epoch 24/150 283/283 [==============================] - 5s 18ms/step - loss: 1.9336 - accuracy: 0.3992 - val_loss: 2.0372 - val_accuracy: 0.3447 Epoch 25/150 283/283 [==============================] - 6s 18ms/step - loss: 1.9222 - accuracy: 0.4082 - val_loss: 2.0732 - val_accuracy: 0.3400 Epoch 26/150 283/283 [==============================] - 6s 19ms/step - loss: 1.9185 - accuracy: 0.3996 - val_loss: 2.1175 - val_accuracy: 0.3237 Epoch 27/150 283/283 [==============================] - 6s 19ms/step - loss: 1.9039 - accuracy: 0.4102 - val_loss: 2.0213 - val_accuracy: 0.3597 Epoch 28/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8989 - accuracy: 0.4092 - val_loss: 2.0140 - val_accuracy: 0.3530 Epoch 29/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8837 - accuracy: 0.4173 - val_loss: 1.9982 - val_accuracy: 0.3643 Epoch 30/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8793 - accuracy: 0.4226 - val_loss: 2.0238 - val_accuracy: 0.3577 Epoch 31/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8620 - accuracy: 0.4274 - val_loss: 1.9689 - val_accuracy: 0.3723 Epoch 32/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8566 - accuracy: 0.4278 - val_loss: 1.9579 - val_accuracy: 0.3720 Epoch 33/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8434 - accuracy: 0.4304 - val_loss: 1.9803 - val_accuracy: 0.3637 Epoch 34/150 283/283 [==============================] - 5s 18ms/step - loss: 1.8364 - accuracy: 0.4339 - val_loss: 1.9402 - val_accuracy: 0.3733 Epoch 35/150 283/283 [==============================] - 6s 18ms/step - loss: 1.8198 - accuracy: 0.4423 - val_loss: 1.9816 - val_accuracy: 0.3707 Epoch 36/150 283/283 [==============================] - 6s 19ms/step - loss: 1.8067 - accuracy: 0.4426 - val_loss: 1.9196 - val_accuracy: 0.3743 Epoch 37/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7969 - accuracy: 0.4448 - val_loss: 2.0591 - val_accuracy: 0.3483 Epoch 38/150 283/283 [==============================] - 6s 20ms/step - loss: 1.7882 - accuracy: 0.4505 - val_loss: 1.8716 - val_accuracy: 0.4007 Epoch 39/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7722 - accuracy: 0.4559 - val_loss: 1.8928 - val_accuracy: 0.3963 Epoch 40/150 283/283 [==============================] - 6s 20ms/step - loss: 1.7635 - accuracy: 0.4567 - val_loss: 1.9044 - val_accuracy: 0.3973 Epoch 41/150 283/283 [==============================] - 6s 20ms/step - loss: 1.7506 - accuracy: 0.4626 - val_loss: 1.8362 - val_accuracy: 0.4050 Epoch 42/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7427 - accuracy: 0.4659 - val_loss: 1.8671 - val_accuracy: 0.4050 Epoch 43/150 283/283 [==============================] - 6s 18ms/step - loss: 1.7249 - accuracy: 0.4667 - val_loss: 1.8166 - val_accuracy: 0.4120 Epoch 44/150 283/283 [==============================] - 6s 20ms/step - loss: 1.7165 - accuracy: 0.4745 - val_loss: 1.8433 - val_accuracy: 0.4050 Epoch 45/150 283/283 [==============================] - 6s 19ms/step - loss: 1.7023 - accuracy: 0.4759 - val_loss: 1.8286 - val_accuracy: 0.4197 Epoch 46/150 283/283 [==============================] - 6s 18ms/step - loss: 1.6924 - accuracy: 0.4784 - val_loss: 1.8084 - val_accuracy: 0.4247 Epoch 47/150 283/283 [==============================] - 6s 20ms/step - loss: 1.6792 - accuracy: 0.4865 - val_loss: 1.8430 - val_accuracy: 0.4143 Epoch 48/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6671 - accuracy: 0.4909 - val_loss: 1.7885 - val_accuracy: 0.4200 Epoch 49/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6584 - accuracy: 0.4900 - val_loss: 1.7715 - val_accuracy: 0.4290 Epoch 50/150 283/283 [==============================] - 6s 18ms/step - loss: 1.6567 - accuracy: 0.4948 - val_loss: 1.7766 - val_accuracy: 0.4283 Epoch 51/150 283/283 [==============================] - 5s 18ms/step - loss: 1.6395 - accuracy: 0.4950 - val_loss: 1.8398 - val_accuracy: 0.4093 Epoch 52/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6324 - accuracy: 0.5022 - val_loss: 1.7525 - val_accuracy: 0.4437 Epoch 53/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6215 - accuracy: 0.5027 - val_loss: 1.7649 - val_accuracy: 0.4413 Epoch 54/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6083 - accuracy: 0.5069 - val_loss: 1.6994 - val_accuracy: 0.4610 Epoch 55/150 283/283 [==============================] - 6s 19ms/step - loss: 1.6003 - accuracy: 0.5071 - val_loss: 1.7128 - val_accuracy: 0.4540 Epoch 56/150 283/283 [==============================] - 6s 18ms/step - loss: 1.6015 - accuracy: 0.5054 - val_loss: 1.7264 - val_accuracy: 0.4490 Epoch 57/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5829 - accuracy: 0.5148 - val_loss: 1.6680 - val_accuracy: 0.4657 Epoch 58/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5742 - accuracy: 0.5125 - val_loss: 1.6950 - val_accuracy: 0.4590 Epoch 59/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5798 - accuracy: 0.5124 - val_loss: 1.7131 - val_accuracy: 0.4313 Epoch 60/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5683 - accuracy: 0.5160 - val_loss: 1.6784 - val_accuracy: 0.4677 Epoch 61/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5537 - accuracy: 0.5222 - val_loss: 1.6850 - val_accuracy: 0.4640 Epoch 62/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5529 - accuracy: 0.5200 - val_loss: 1.6416 - val_accuracy: 0.4810 Epoch 63/150 283/283 [==============================] - 6s 18ms/step - loss: 1.5389 - accuracy: 0.5233 - val_loss: 1.6832 - val_accuracy: 0.4483 Epoch 64/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5399 - accuracy: 0.5268 - val_loss: 1.6924 - val_accuracy: 0.4620 Epoch 65/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5288 - accuracy: 0.5305 - val_loss: 1.6469 - val_accuracy: 0.4733 Epoch 66/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5219 - accuracy: 0.5287 - val_loss: 1.6156 - val_accuracy: 0.4793 Epoch 67/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5168 - accuracy: 0.5341 - val_loss: 1.5978 - val_accuracy: 0.4900 Epoch 68/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5190 - accuracy: 0.5316 - val_loss: 1.6261 - val_accuracy: 0.4813 Epoch 69/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5079 - accuracy: 0.5362 - val_loss: 1.6210 - val_accuracy: 0.4640 Epoch 70/150 283/283 [==============================] - 6s 19ms/step - loss: 1.5058 - accuracy: 0.5343 - val_loss: 1.6225 - val_accuracy: 0.4680 Epoch 71/150 283/283 [==============================] - 6s 20ms/step - loss: 1.5005 - accuracy: 0.5323 - val_loss: 1.6263 - val_accuracy: 0.4837 Epoch 72/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4924 - accuracy: 0.5371 - val_loss: 1.6071 - val_accuracy: 0.4923 Epoch 73/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4891 - accuracy: 0.5387 - val_loss: 1.5888 - val_accuracy: 0.4923 Epoch 74/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4874 - accuracy: 0.5447 - val_loss: 1.5919 - val_accuracy: 0.4923 Epoch 75/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4810 - accuracy: 0.5456 - val_loss: 1.5826 - val_accuracy: 0.5043 Epoch 76/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4840 - accuracy: 0.5415 - val_loss: 1.6069 - val_accuracy: 0.4680 Epoch 77/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4725 - accuracy: 0.5420 - val_loss: 1.5421 - val_accuracy: 0.5087 Epoch 78/150 283/283 [==============================] - 6s 21ms/step - loss: 1.4657 - accuracy: 0.5438 - val_loss: 1.5525 - val_accuracy: 0.5033 Epoch 79/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4614 - accuracy: 0.5474 - val_loss: 1.5563 - val_accuracy: 0.4980 Epoch 80/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4541 - accuracy: 0.5477 - val_loss: 1.5777 - val_accuracy: 0.4987 Epoch 81/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4488 - accuracy: 0.5484 - val_loss: 1.5623 - val_accuracy: 0.4840 Epoch 82/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4514 - accuracy: 0.5533 - val_loss: 1.5643 - val_accuracy: 0.5047 Epoch 83/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4515 - accuracy: 0.5461 - val_loss: 1.5676 - val_accuracy: 0.4897 Epoch 84/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4472 - accuracy: 0.5533 - val_loss: 1.6115 - val_accuracy: 0.4690 Epoch 85/150 283/283 [==============================] - 6s 18ms/step - loss: 1.4319 - accuracy: 0.5589 - val_loss: 1.5467 - val_accuracy: 0.4910 Epoch 86/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4298 - accuracy: 0.5551 - val_loss: 1.4988 - val_accuracy: 0.5203 Epoch 87/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4262 - accuracy: 0.5542 - val_loss: 1.5914 - val_accuracy: 0.4887 Epoch 88/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4214 - accuracy: 0.5601 - val_loss: 1.5237 - val_accuracy: 0.5143 Epoch 89/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4171 - accuracy: 0.5593 - val_loss: 1.5236 - val_accuracy: 0.5070 Epoch 90/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4162 - accuracy: 0.5595 - val_loss: 1.5430 - val_accuracy: 0.5050 Epoch 91/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4066 - accuracy: 0.5673 - val_loss: 1.4819 - val_accuracy: 0.5257 Epoch 92/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4068 - accuracy: 0.5619 - val_loss: 1.5051 - val_accuracy: 0.5073 Epoch 93/150 283/283 [==============================] - 6s 19ms/step - loss: 1.4112 - accuracy: 0.5596 - val_loss: 1.5007 - val_accuracy: 0.5167 Epoch 94/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3997 - accuracy: 0.5629 - val_loss: 1.4935 - val_accuracy: 0.5310 Epoch 95/150 283/283 [==============================] - 6s 20ms/step - loss: 1.4011 - accuracy: 0.5656 - val_loss: 1.4748 - val_accuracy: 0.5363 Epoch 96/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3906 - accuracy: 0.5673 - val_loss: 1.4938 - val_accuracy: 0.5107 Epoch 97/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3998 - accuracy: 0.5614 - val_loss: 1.4726 - val_accuracy: 0.5307 Epoch 98/150 283/283 [==============================] - 7s 22ms/step - loss: 1.3908 - accuracy: 0.5662 - val_loss: 1.4529 - val_accuracy: 0.5413 Epoch 99/150 283/283 [==============================] - 7s 22ms/step - loss: 1.3828 - accuracy: 0.5738 - val_loss: 1.4762 - val_accuracy: 0.5230 Epoch 100/150 283/283 [==============================] - 7s 22ms/step - loss: 1.3779 - accuracy: 0.5708 - val_loss: 1.4524 - val_accuracy: 0.5377 Epoch 101/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3733 - accuracy: 0.5755 - val_loss: 1.4700 - val_accuracy: 0.5320 Epoch 102/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3751 - accuracy: 0.5718 - val_loss: 1.4555 - val_accuracy: 0.5253 Epoch 103/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3670 - accuracy: 0.5755 - val_loss: 1.4904 - val_accuracy: 0.5060 Epoch 104/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3630 - accuracy: 0.5726 - val_loss: 1.4542 - val_accuracy: 0.5410 Epoch 105/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3666 - accuracy: 0.5753 - val_loss: 1.4691 - val_accuracy: 0.5313 Epoch 106/150 283/283 [==============================] - 8s 26ms/step - loss: 1.3562 - accuracy: 0.5771 - val_loss: 1.4281 - val_accuracy: 0.5393 Epoch 107/150 283/283 [==============================] - 7s 24ms/step - loss: 1.3525 - accuracy: 0.5812 - val_loss: 1.4629 - val_accuracy: 0.5293 Epoch 108/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3495 - accuracy: 0.5834 - val_loss: 1.4649 - val_accuracy: 0.5287 Epoch 109/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3426 - accuracy: 0.5784 - val_loss: 1.5098 - val_accuracy: 0.4933 Epoch 110/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3541 - accuracy: 0.5785 - val_loss: 1.4426 - val_accuracy: 0.5323 Epoch 111/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3405 - accuracy: 0.5813 - val_loss: 1.4394 - val_accuracy: 0.5350 Epoch 112/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3362 - accuracy: 0.5822 - val_loss: 1.4045 - val_accuracy: 0.5577 Epoch 113/150 283/283 [==============================] - 6s 19ms/step - loss: 1.3393 - accuracy: 0.5801 - val_loss: 1.4593 - val_accuracy: 0.5550 Epoch 114/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3353 - accuracy: 0.5864 - val_loss: 1.4318 - val_accuracy: 0.5580 Epoch 115/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3331 - accuracy: 0.5850 - val_loss: 1.5539 - val_accuracy: 0.4793 Epoch 116/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3326 - accuracy: 0.5824 - val_loss: 1.4167 - val_accuracy: 0.5550 Epoch 117/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3224 - accuracy: 0.5866 - val_loss: 1.4175 - val_accuracy: 0.5493 Epoch 118/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3198 - accuracy: 0.5905 - val_loss: 1.3844 - val_accuracy: 0.5633 Epoch 119/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3233 - accuracy: 0.5904 - val_loss: 1.3924 - val_accuracy: 0.5630 Epoch 120/150 283/283 [==============================] - 6s 21ms/step - loss: 1.3162 - accuracy: 0.5896 - val_loss: 1.4084 - val_accuracy: 0.5383 Epoch 121/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3054 - accuracy: 0.5975 - val_loss: 1.3621 - val_accuracy: 0.5723 Epoch 122/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3085 - accuracy: 0.5901 - val_loss: 1.4888 - val_accuracy: 0.5053 Epoch 123/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3183 - accuracy: 0.5875 - val_loss: 1.3833 - val_accuracy: 0.5730 Epoch 124/150 283/283 [==============================] - 6s 20ms/step - loss: 1.3001 - accuracy: 0.5976 - val_loss: 1.4048 - val_accuracy: 0.5563 Epoch 125/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2946 - accuracy: 0.5958 - val_loss: 1.3901 - val_accuracy: 0.5660 Epoch 126/150 283/283 [==============================] - 6s 21ms/step - loss: 1.2986 - accuracy: 0.5979 - val_loss: 1.3900 - val_accuracy: 0.5463 Epoch 127/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2943 - accuracy: 0.5992 - val_loss: 1.4133 - val_accuracy: 0.5467 Epoch 128/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2919 - accuracy: 0.5994 - val_loss: 1.3576 - val_accuracy: 0.5693 Epoch 129/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2932 - accuracy: 0.5973 - val_loss: 1.3512 - val_accuracy: 0.5750 Epoch 130/150 283/283 [==============================] - 6s 21ms/step - loss: 1.2860 - accuracy: 0.5959 - val_loss: 1.3800 - val_accuracy: 0.5613 Epoch 131/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2833 - accuracy: 0.6042 - val_loss: 1.5212 - val_accuracy: 0.5180 Epoch 132/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2843 - accuracy: 0.5958 - val_loss: 1.3550 - val_accuracy: 0.5783 Epoch 133/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2728 - accuracy: 0.6031 - val_loss: 1.3965 - val_accuracy: 0.5647 Epoch 134/150 283/283 [==============================] - 5s 18ms/step - loss: 1.2671 - accuracy: 0.6061 - val_loss: 1.3845 - val_accuracy: 0.5560 Epoch 135/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2789 - accuracy: 0.5984 - val_loss: 1.3724 - val_accuracy: 0.5573 Epoch 136/150 283/283 [==============================] - 6s 18ms/step - loss: 1.2713 - accuracy: 0.5989 - val_loss: 1.3627 - val_accuracy: 0.5557 Epoch 137/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2643 - accuracy: 0.6030 - val_loss: 1.3537 - val_accuracy: 0.5737 Epoch 138/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2703 - accuracy: 0.6043 - val_loss: 1.3661 - val_accuracy: 0.5757 Epoch 139/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2646 - accuracy: 0.6015 - val_loss: 1.3388 - val_accuracy: 0.5780 Epoch 140/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2547 - accuracy: 0.6072 - val_loss: 1.3191 - val_accuracy: 0.5790 Epoch 141/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2556 - accuracy: 0.6061 - val_loss: 1.3292 - val_accuracy: 0.5733 Epoch 142/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2515 - accuracy: 0.6091 - val_loss: 1.3956 - val_accuracy: 0.5443 Epoch 143/150 283/283 [==============================] - 6s 19ms/step - loss: 1.2570 - accuracy: 0.6045 - val_loss: 1.3551 - val_accuracy: 0.5613 Epoch 144/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2469 - accuracy: 0.6057 - val_loss: 1.3258 - val_accuracy: 0.5820 Epoch 145/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2403 - accuracy: 0.6167 - val_loss: 1.3406 - val_accuracy: 0.5597 Epoch 146/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2382 - accuracy: 0.6159 - val_loss: 1.3075 - val_accuracy: 0.5807 Epoch 147/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2416 - accuracy: 0.6107 - val_loss: 1.3137 - val_accuracy: 0.5940 Epoch 148/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2443 - accuracy: 0.6083 - val_loss: 1.3479 - val_accuracy: 0.5753 Epoch 149/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2391 - accuracy: 0.6153 - val_loss: 1.3397 - val_accuracy: 0.5690 Epoch 150/150 283/283 [==============================] - 6s 20ms/step - loss: 1.2366 - accuracy: 0.6134 - val_loss: 1.3032 - val_accuracy: 0.5860
res31_model_list = [baseline31_model, baseline31luminosity_model, baseline31desaturation_model]
res31_data_list = [res31greyscale_validation_data, res31luminosity_validation_data, res31desaturation_validation_data]
name_list = ["Normally Greyscaled", "Luminosity Method", "Desaturation Method"]
cnnf.model_comparison(res31_model_list, res31_data_list, name_list)
94/94 [==============================] - 1s 10ms/step - loss: 1.3386 - accuracy: 0.5730 94/94 [==============================] - 1s 10ms/step - loss: 1.2177 - accuracy: 0.6267 94/94 [==============================] - 1s 11ms/step - loss: 1.1963 - accuracy: 0.6060
From the above, we can see that for the 31 x 31 image size, the Luminosity Method is the best to use, yielding the highest overall classification accuracy, and the second-best model loss. Hence, for the 31 x 31 image size, we shall use the images greyscaled with the Luminosity method.
res128_model_list = [baseline128_model, baseline128luminosity_model, baseline128desaturation_model]
res128_data_list = [res128greyscale_validation_data, res128luminosity_validation_data, res128desaturation_validation_data]
name_list = ["Normally Greyscaled", "Luminosity Method", "Desaturation Method"]
cnnf.model_comparison(res128_model_list, res128_data_list, name_list)
94/94 [==============================] - 5s 37ms/step - loss: 1.3631 - accuracy: 0.5850 94/94 [==============================] - 1s 11ms/step - loss: 1.3337 - accuracy: 0.6287 94/94 [==============================] - 2s 14ms/step - loss: 1.3032 - accuracy: 0.5860
From the above comparison, we can see that for the 128 x 128 image size, the Luminosity Method is the best to use, as it gives the highest overall classification accuracy, as well as the second-best overall loss. Hence, we shall continue with the 128 x 128 image size using data greyscaled by the Luminosity Method.
Visualization of baseline model for 31 x 31 images.
tf.keras.utils.plot_model(baseline31_model, show_shapes=True, dpi=80)
tf.keras.utils.plot_model(baseline128_model, show_shapes=True, dpi=80)
Now, we shall normalize the final greyscaled data. Earlier on in the dataset, we did some basic normalization, namely, dividing it by 255. However, we shall now go a step further and shift the values, obtaining a standard deviation of 1, and a mean of 0.5 using Standardization. The formula for standardization is as follows: $$ \text Normalized\enspace Pixel\enspace Value = \frac{Pixel Value - \mu}{\theta} $$
# Function to compute mean and standard deviation for a dataset
def compute_mean_std(dataset, batch_size=32):
# Initialize sum and sum of squares
sum_images = 0
sum_sq_images = 0
num_samples = 0
for images, _ in dataset.batch(batch_size):
sum_images += tf.reduce_sum(images, axis=[0, 1, 2])
sum_sq_images += tf.reduce_sum(tf.square(images), axis=[0, 1, 2])
num_samples += tf.cast(tf.reduce_prod(tf.shape(images)[0:3]), tf.float32)
# Compute mean and variance
mean = sum_images / num_samples
variance = (sum_sq_images / num_samples) - tf.square(mean)
stddev = tf.sqrt(variance + 1e-7)
return mean, stddev
# Function to standardize images
def standardize_image(image, label, mean, stddev):
return (image - mean) / (stddev + 1e-7), label
# Compute mean and stddev for each dataset
# Make sure your datasets are not batched here
mean_res31, stddev_res31 = compute_mean_std(res31luminosity_train_data.unbatch())
mean_res128, stddev_res128 = compute_mean_std(res128luminosity_train_data.unbatch())
# Apply standardization to the datasets
std_res31_train = res31luminosity_train_data.map(
lambda image, label: standardize_image(image, label, mean_res31, stddev_res31),
num_parallel_calls=tf.data.AUTOTUNE
)
std_res31_validation = res31luminosity_validation_data.map(
lambda image, label: standardize_image(image, label, mean_res31, stddev_res31),
num_parallel_calls=tf.data.AUTOTUNE
)
std_res31_test = res31luminosity_test_data.map(
lambda image, label: standardize_image(image, label, mean_res31, stddev_res31),
num_parallel_calls=tf.data.AUTOTUNE
)
std_res31_full = std_res31_train.concatenate(std_res31_validation)
std_res31_full = std_res31_full.concatenate(std_res31_test)
# Apply standardization to the res128 datasets
std_res128_train = res128luminosity_train_data.map(
lambda image, label: standardize_image(image, label, mean_res128, stddev_res128),
num_parallel_calls=tf.data.AUTOTUNE
)
std_res128_validation = res128luminosity_validation_data.map(
lambda image, label: standardize_image(image, label, mean_res128, stddev_res128),
num_parallel_calls=tf.data.AUTOTUNE
)
std_res128_test = res128luminosity_test_data.map(
lambda image, label: standardize_image(image, label, mean_res128, stddev_res128),
num_parallel_calls=tf.data.AUTOTUNE
)
std_res128_full = std_res128_train.concatenate(std_res128_validation)
std_res128_full = std_res128_full.concatenate(std_res128_test)
for image, label in res31luminosity_train_data.unbatch().take(1):
original_image = image.numpy()
normalized_image, label = standardize_image(image, label, mean_res31, stddev_res31)
normalized_image = normalized_image.numpy()
# Plotting the original and normalized images
plt.figure(figsize=(12, 6))
plt.subplot(1, 2, 1)
plt.imshow(original_image, cmap = "gray")
plt.title("Original Image")
plt.axis('off')
plt.subplot(1, 2, 2)
plt.imshow(normalized_image, cmap = "gray")
plt.title("Normalized Image")
plt.axis('off')
plt.show()
for image, label in res128luminosity_train_data.unbatch().take(1):
original_image = image.numpy()
normalized_image, label = standardize_image(image, label, mean_res31, stddev_res31)
normalized_image = normalized_image.numpy()
# Plotting the original and normalized images
plt.figure(figsize=(12, 6))
plt.subplot(1, 2, 1)
plt.imshow(original_image, cmap = "gray")
plt.title("Original Image")
plt.axis('off')
plt.subplot(1, 2, 2)
plt.imshow(normalized_image, cmap = "gray")
plt.title("Normalized Image")
plt.axis('off')
plt.show()
cnnf.average_images_per_class(std_res31_full.unbatch(), class_names)
cnnf.average_images_per_class(std_res128_full.unbatch(), class_names)
From the above 4 plots, we can see that there is a slight difference after normalizing the images. The reason for the difference being this small may be due to the original mean already being close to the target of 0.5, hence requiring minimal shift in the pixel's brightness. We can also see that there are some differences in the features between the classes, with some having black centers, and some having whites. Some of the classes also have a "spot" in the middle, while some have the black/white spread through out the image. This tells us that there are potential features for the later models to pick apart, helping it to identify the individual classees.
Given that we already started using a basic CNN as a baseline model earlier, we shall now move onto using more complicated models for our task. We shall base our more complicated models off of the already established VGG and InceptionNet architectures, modifying them slightly to fit our use cases.
Furthermore, as these models learn quickly compared to the baseline models we created earlier, we shall increase the patience of our EarlyStopper, to give the models more time to learn.
earlyStopper = EarlyStopping(monitor = 'val_loss', patience = 30)
What is VGG?

vgg16Model = VGG16(
include_top=True,
weights='imagenet',
input_tensor=None,
input_shape=None,
pooling=None,
classes=1000,
classifier_activation='softmax'
)
vgg16Model.summary()
Model: "vgg16"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_15 (InputLayer) [(None, 224, 224, 3)] 0
block1_conv1 (Conv2D) (None, 224, 224, 64) 1792
block1_conv2 (Conv2D) (None, 224, 224, 64) 36928
block1_pool (MaxPooling2D) (None, 112, 112, 64) 0
block2_conv1 (Conv2D) (None, 112, 112, 128) 73856
block2_conv2 (Conv2D) (None, 112, 112, 128) 147584
block2_pool (MaxPooling2D) (None, 56, 56, 128) 0
block3_conv1 (Conv2D) (None, 56, 56, 256) 295168
block3_conv2 (Conv2D) (None, 56, 56, 256) 590080
block3_conv3 (Conv2D) (None, 56, 56, 256) 590080
block3_pool (MaxPooling2D) (None, 28, 28, 256) 0
block4_conv1 (Conv2D) (None, 28, 28, 512) 1180160
block4_conv2 (Conv2D) (None, 28, 28, 512) 2359808
block4_conv3 (Conv2D) (None, 28, 28, 512) 2359808
block4_pool (MaxPooling2D) (None, 14, 14, 512) 0
block5_conv1 (Conv2D) (None, 14, 14, 512) 2359808
block5_conv2 (Conv2D) (None, 14, 14, 512) 2359808
block5_conv3 (Conv2D) (None, 14, 14, 512) 2359808
block5_pool (MaxPooling2D) (None, 7, 7, 512) 0
flatten (Flatten) (None, 25088) 0
fc1 (Dense) (None, 4096) 102764544
fc2 (Dense) (None, 4096) 16781312
predictions (Dense) (None, 1000) 4097000
=================================================================
Total params: 138,357,544
Trainable params: 138,357,544
Non-trainable params: 0
_________________________________________________________________
From the above, we can see that the example VGG16 model has 138 million parameters, with a 224 x 224 input layer. We shall simplify the model, and reduce the number of parameters to make it suitable for our 31 x 31 and 128 x 128 input images.
What is InceptionNet?
The architecture of InceptionNet V3 includes an input layer, followed by a series of Inception modules, auxiliary classifiers for training, and ends with fully connected layers. This structure allows it to manage a more diverse range of filters simultaneously, enhancing its capability to recognize a wide array of features in images. While it maintains a high level of accuracy, Inception V3 is also optimized for efficient computation, making it a robust choice for challenging image recognition tasks. The model is known for its performance on ImageNet and other benchmark datasets, demonstrating its effectiveness in complex object detection and classification scenarios.
A typical InceptionNet V3 architecture consists of grid size reduction modules bewteen each version of "Inception Module". For our use case, we shall remove these grid-size reductions, given the already small size of our images, at 128 x 128, and 31 x 31, compared to the original 299 x 299 which InceptionNet V3 was created and optimized for.
inceptionModel = tf.keras.applications.InceptionV3(
include_top=True,
weights="imagenet",
input_tensor=None,
input_shape=None,
pooling=None,
classes=1000,
classifier_activation="softmax",
)
inceptionModel.summary()
Model: "inception_v3"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_16 (InputLayer) [(None, 299, 299, 3 0 []
)]
conv2d_199 (Conv2D) (None, 149, 149, 32 864 ['input_16[0][0]']
)
batch_normalization_157 (Batch (None, 149, 149, 32 96 ['conv2d_199[0][0]']
Normalization) )
activation (Activation) (None, 149, 149, 32 0 ['batch_normalization_157[0][0]']
)
conv2d_200 (Conv2D) (None, 147, 147, 32 9216 ['activation[0][0]']
)
batch_normalization_158 (Batch (None, 147, 147, 32 96 ['conv2d_200[0][0]']
Normalization) )
activation_1 (Activation) (None, 147, 147, 32 0 ['batch_normalization_158[0][0]']
)
conv2d_201 (Conv2D) (None, 147, 147, 64 18432 ['activation_1[0][0]']
)
batch_normalization_159 (Batch (None, 147, 147, 64 192 ['conv2d_201[0][0]']
Normalization) )
activation_2 (Activation) (None, 147, 147, 64 0 ['batch_normalization_159[0][0]']
)
max_pooling2d_81 (MaxPooling2D (None, 73, 73, 64) 0 ['activation_2[0][0]']
)
conv2d_202 (Conv2D) (None, 73, 73, 80) 5120 ['max_pooling2d_81[0][0]']
batch_normalization_160 (Batch (None, 73, 73, 80) 240 ['conv2d_202[0][0]']
Normalization)
activation_3 (Activation) (None, 73, 73, 80) 0 ['batch_normalization_160[0][0]']
conv2d_203 (Conv2D) (None, 71, 71, 192) 138240 ['activation_3[0][0]']
batch_normalization_161 (Batch (None, 71, 71, 192) 576 ['conv2d_203[0][0]']
Normalization)
activation_4 (Activation) (None, 71, 71, 192) 0 ['batch_normalization_161[0][0]']
max_pooling2d_82 (MaxPooling2D (None, 35, 35, 192) 0 ['activation_4[0][0]']
)
conv2d_207 (Conv2D) (None, 35, 35, 64) 12288 ['max_pooling2d_82[0][0]']
batch_normalization_165 (Batch (None, 35, 35, 64) 192 ['conv2d_207[0][0]']
Normalization)
activation_8 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_165[0][0]']
conv2d_205 (Conv2D) (None, 35, 35, 48) 9216 ['max_pooling2d_82[0][0]']
conv2d_208 (Conv2D) (None, 35, 35, 96) 55296 ['activation_8[0][0]']
batch_normalization_163 (Batch (None, 35, 35, 48) 144 ['conv2d_205[0][0]']
Normalization)
batch_normalization_166 (Batch (None, 35, 35, 96) 288 ['conv2d_208[0][0]']
Normalization)
activation_6 (Activation) (None, 35, 35, 48) 0 ['batch_normalization_163[0][0]']
activation_9 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_166[0][0]']
average_pooling2d_7 (AveragePo (None, 35, 35, 192) 0 ['max_pooling2d_82[0][0]']
oling2D)
conv2d_204 (Conv2D) (None, 35, 35, 64) 12288 ['max_pooling2d_82[0][0]']
conv2d_206 (Conv2D) (None, 35, 35, 64) 76800 ['activation_6[0][0]']
conv2d_209 (Conv2D) (None, 35, 35, 96) 82944 ['activation_9[0][0]']
conv2d_210 (Conv2D) (None, 35, 35, 32) 6144 ['average_pooling2d_7[0][0]']
batch_normalization_162 (Batch (None, 35, 35, 64) 192 ['conv2d_204[0][0]']
Normalization)
batch_normalization_164 (Batch (None, 35, 35, 64) 192 ['conv2d_206[0][0]']
Normalization)
batch_normalization_167 (Batch (None, 35, 35, 96) 288 ['conv2d_209[0][0]']
Normalization)
batch_normalization_168 (Batch (None, 35, 35, 32) 96 ['conv2d_210[0][0]']
Normalization)
activation_5 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_162[0][0]']
activation_7 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_164[0][0]']
activation_10 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_167[0][0]']
activation_11 (Activation) (None, 35, 35, 32) 0 ['batch_normalization_168[0][0]']
mixed0 (Concatenate) (None, 35, 35, 256) 0 ['activation_5[0][0]',
'activation_7[0][0]',
'activation_10[0][0]',
'activation_11[0][0]']
conv2d_214 (Conv2D) (None, 35, 35, 64) 16384 ['mixed0[0][0]']
batch_normalization_172 (Batch (None, 35, 35, 64) 192 ['conv2d_214[0][0]']
Normalization)
activation_15 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_172[0][0]']
conv2d_212 (Conv2D) (None, 35, 35, 48) 12288 ['mixed0[0][0]']
conv2d_215 (Conv2D) (None, 35, 35, 96) 55296 ['activation_15[0][0]']
batch_normalization_170 (Batch (None, 35, 35, 48) 144 ['conv2d_212[0][0]']
Normalization)
batch_normalization_173 (Batch (None, 35, 35, 96) 288 ['conv2d_215[0][0]']
Normalization)
activation_13 (Activation) (None, 35, 35, 48) 0 ['batch_normalization_170[0][0]']
activation_16 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_173[0][0]']
average_pooling2d_8 (AveragePo (None, 35, 35, 256) 0 ['mixed0[0][0]']
oling2D)
conv2d_211 (Conv2D) (None, 35, 35, 64) 16384 ['mixed0[0][0]']
conv2d_213 (Conv2D) (None, 35, 35, 64) 76800 ['activation_13[0][0]']
conv2d_216 (Conv2D) (None, 35, 35, 96) 82944 ['activation_16[0][0]']
conv2d_217 (Conv2D) (None, 35, 35, 64) 16384 ['average_pooling2d_8[0][0]']
batch_normalization_169 (Batch (None, 35, 35, 64) 192 ['conv2d_211[0][0]']
Normalization)
batch_normalization_171 (Batch (None, 35, 35, 64) 192 ['conv2d_213[0][0]']
Normalization)
batch_normalization_174 (Batch (None, 35, 35, 96) 288 ['conv2d_216[0][0]']
Normalization)
batch_normalization_175 (Batch (None, 35, 35, 64) 192 ['conv2d_217[0][0]']
Normalization)
activation_12 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_169[0][0]']
activation_14 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_171[0][0]']
activation_17 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_174[0][0]']
activation_18 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_175[0][0]']
mixed1 (Concatenate) (None, 35, 35, 288) 0 ['activation_12[0][0]',
'activation_14[0][0]',
'activation_17[0][0]',
'activation_18[0][0]']
conv2d_221 (Conv2D) (None, 35, 35, 64) 18432 ['mixed1[0][0]']
batch_normalization_179 (Batch (None, 35, 35, 64) 192 ['conv2d_221[0][0]']
Normalization)
activation_22 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_179[0][0]']
conv2d_219 (Conv2D) (None, 35, 35, 48) 13824 ['mixed1[0][0]']
conv2d_222 (Conv2D) (None, 35, 35, 96) 55296 ['activation_22[0][0]']
batch_normalization_177 (Batch (None, 35, 35, 48) 144 ['conv2d_219[0][0]']
Normalization)
batch_normalization_180 (Batch (None, 35, 35, 96) 288 ['conv2d_222[0][0]']
Normalization)
activation_20 (Activation) (None, 35, 35, 48) 0 ['batch_normalization_177[0][0]']
activation_23 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_180[0][0]']
average_pooling2d_9 (AveragePo (None, 35, 35, 288) 0 ['mixed1[0][0]']
oling2D)
conv2d_218 (Conv2D) (None, 35, 35, 64) 18432 ['mixed1[0][0]']
conv2d_220 (Conv2D) (None, 35, 35, 64) 76800 ['activation_20[0][0]']
conv2d_223 (Conv2D) (None, 35, 35, 96) 82944 ['activation_23[0][0]']
conv2d_224 (Conv2D) (None, 35, 35, 64) 18432 ['average_pooling2d_9[0][0]']
batch_normalization_176 (Batch (None, 35, 35, 64) 192 ['conv2d_218[0][0]']
Normalization)
batch_normalization_178 (Batch (None, 35, 35, 64) 192 ['conv2d_220[0][0]']
Normalization)
batch_normalization_181 (Batch (None, 35, 35, 96) 288 ['conv2d_223[0][0]']
Normalization)
batch_normalization_182 (Batch (None, 35, 35, 64) 192 ['conv2d_224[0][0]']
Normalization)
activation_19 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_176[0][0]']
activation_21 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_178[0][0]']
activation_24 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_181[0][0]']
activation_25 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_182[0][0]']
mixed2 (Concatenate) (None, 35, 35, 288) 0 ['activation_19[0][0]',
'activation_21[0][0]',
'activation_24[0][0]',
'activation_25[0][0]']
conv2d_226 (Conv2D) (None, 35, 35, 64) 18432 ['mixed2[0][0]']
batch_normalization_184 (Batch (None, 35, 35, 64) 192 ['conv2d_226[0][0]']
Normalization)
activation_27 (Activation) (None, 35, 35, 64) 0 ['batch_normalization_184[0][0]']
conv2d_227 (Conv2D) (None, 35, 35, 96) 55296 ['activation_27[0][0]']
batch_normalization_185 (Batch (None, 35, 35, 96) 288 ['conv2d_227[0][0]']
Normalization)
activation_28 (Activation) (None, 35, 35, 96) 0 ['batch_normalization_185[0][0]']
conv2d_225 (Conv2D) (None, 17, 17, 384) 995328 ['mixed2[0][0]']
conv2d_228 (Conv2D) (None, 17, 17, 96) 82944 ['activation_28[0][0]']
batch_normalization_183 (Batch (None, 17, 17, 384) 1152 ['conv2d_225[0][0]']
Normalization)
batch_normalization_186 (Batch (None, 17, 17, 96) 288 ['conv2d_228[0][0]']
Normalization)
activation_26 (Activation) (None, 17, 17, 384) 0 ['batch_normalization_183[0][0]']
activation_29 (Activation) (None, 17, 17, 96) 0 ['batch_normalization_186[0][0]']
max_pooling2d_83 (MaxPooling2D (None, 17, 17, 288) 0 ['mixed2[0][0]']
)
mixed3 (Concatenate) (None, 17, 17, 768) 0 ['activation_26[0][0]',
'activation_29[0][0]',
'max_pooling2d_83[0][0]']
conv2d_233 (Conv2D) (None, 17, 17, 128) 98304 ['mixed3[0][0]']
batch_normalization_191 (Batch (None, 17, 17, 128) 384 ['conv2d_233[0][0]']
Normalization)
activation_34 (Activation) (None, 17, 17, 128) 0 ['batch_normalization_191[0][0]']
conv2d_234 (Conv2D) (None, 17, 17, 128) 114688 ['activation_34[0][0]']
batch_normalization_192 (Batch (None, 17, 17, 128) 384 ['conv2d_234[0][0]']
Normalization)
activation_35 (Activation) (None, 17, 17, 128) 0 ['batch_normalization_192[0][0]']
conv2d_230 (Conv2D) (None, 17, 17, 128) 98304 ['mixed3[0][0]']
conv2d_235 (Conv2D) (None, 17, 17, 128) 114688 ['activation_35[0][0]']
batch_normalization_188 (Batch (None, 17, 17, 128) 384 ['conv2d_230[0][0]']
Normalization)
batch_normalization_193 (Batch (None, 17, 17, 128) 384 ['conv2d_235[0][0]']
Normalization)
activation_31 (Activation) (None, 17, 17, 128) 0 ['batch_normalization_188[0][0]']
activation_36 (Activation) (None, 17, 17, 128) 0 ['batch_normalization_193[0][0]']
conv2d_231 (Conv2D) (None, 17, 17, 128) 114688 ['activation_31[0][0]']
conv2d_236 (Conv2D) (None, 17, 17, 128) 114688 ['activation_36[0][0]']
batch_normalization_189 (Batch (None, 17, 17, 128) 384 ['conv2d_231[0][0]']
Normalization)
batch_normalization_194 (Batch (None, 17, 17, 128) 384 ['conv2d_236[0][0]']
Normalization)
activation_32 (Activation) (None, 17, 17, 128) 0 ['batch_normalization_189[0][0]']
activation_37 (Activation) (None, 17, 17, 128) 0 ['batch_normalization_194[0][0]']
average_pooling2d_10 (AverageP (None, 17, 17, 768) 0 ['mixed3[0][0]']
ooling2D)
conv2d_229 (Conv2D) (None, 17, 17, 192) 147456 ['mixed3[0][0]']
conv2d_232 (Conv2D) (None, 17, 17, 192) 172032 ['activation_32[0][0]']
conv2d_237 (Conv2D) (None, 17, 17, 192) 172032 ['activation_37[0][0]']
conv2d_238 (Conv2D) (None, 17, 17, 192) 147456 ['average_pooling2d_10[0][0]']
batch_normalization_187 (Batch (None, 17, 17, 192) 576 ['conv2d_229[0][0]']
Normalization)
batch_normalization_190 (Batch (None, 17, 17, 192) 576 ['conv2d_232[0][0]']
Normalization)
batch_normalization_195 (Batch (None, 17, 17, 192) 576 ['conv2d_237[0][0]']
Normalization)
batch_normalization_196 (Batch (None, 17, 17, 192) 576 ['conv2d_238[0][0]']
Normalization)
activation_30 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_187[0][0]']
activation_33 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_190[0][0]']
activation_38 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_195[0][0]']
activation_39 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_196[0][0]']
mixed4 (Concatenate) (None, 17, 17, 768) 0 ['activation_30[0][0]',
'activation_33[0][0]',
'activation_38[0][0]',
'activation_39[0][0]']
conv2d_243 (Conv2D) (None, 17, 17, 160) 122880 ['mixed4[0][0]']
batch_normalization_201 (Batch (None, 17, 17, 160) 480 ['conv2d_243[0][0]']
Normalization)
activation_44 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_201[0][0]']
conv2d_244 (Conv2D) (None, 17, 17, 160) 179200 ['activation_44[0][0]']
batch_normalization_202 (Batch (None, 17, 17, 160) 480 ['conv2d_244[0][0]']
Normalization)
activation_45 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_202[0][0]']
conv2d_240 (Conv2D) (None, 17, 17, 160) 122880 ['mixed4[0][0]']
conv2d_245 (Conv2D) (None, 17, 17, 160) 179200 ['activation_45[0][0]']
batch_normalization_198 (Batch (None, 17, 17, 160) 480 ['conv2d_240[0][0]']
Normalization)
batch_normalization_203 (Batch (None, 17, 17, 160) 480 ['conv2d_245[0][0]']
Normalization)
activation_41 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_198[0][0]']
activation_46 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_203[0][0]']
conv2d_241 (Conv2D) (None, 17, 17, 160) 179200 ['activation_41[0][0]']
conv2d_246 (Conv2D) (None, 17, 17, 160) 179200 ['activation_46[0][0]']
batch_normalization_199 (Batch (None, 17, 17, 160) 480 ['conv2d_241[0][0]']
Normalization)
batch_normalization_204 (Batch (None, 17, 17, 160) 480 ['conv2d_246[0][0]']
Normalization)
activation_42 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_199[0][0]']
activation_47 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_204[0][0]']
average_pooling2d_11 (AverageP (None, 17, 17, 768) 0 ['mixed4[0][0]']
ooling2D)
conv2d_239 (Conv2D) (None, 17, 17, 192) 147456 ['mixed4[0][0]']
conv2d_242 (Conv2D) (None, 17, 17, 192) 215040 ['activation_42[0][0]']
conv2d_247 (Conv2D) (None, 17, 17, 192) 215040 ['activation_47[0][0]']
conv2d_248 (Conv2D) (None, 17, 17, 192) 147456 ['average_pooling2d_11[0][0]']
batch_normalization_197 (Batch (None, 17, 17, 192) 576 ['conv2d_239[0][0]']
Normalization)
batch_normalization_200 (Batch (None, 17, 17, 192) 576 ['conv2d_242[0][0]']
Normalization)
batch_normalization_205 (Batch (None, 17, 17, 192) 576 ['conv2d_247[0][0]']
Normalization)
batch_normalization_206 (Batch (None, 17, 17, 192) 576 ['conv2d_248[0][0]']
Normalization)
activation_40 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_197[0][0]']
activation_43 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_200[0][0]']
activation_48 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_205[0][0]']
activation_49 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_206[0][0]']
mixed5 (Concatenate) (None, 17, 17, 768) 0 ['activation_40[0][0]',
'activation_43[0][0]',
'activation_48[0][0]',
'activation_49[0][0]']
conv2d_253 (Conv2D) (None, 17, 17, 160) 122880 ['mixed5[0][0]']
batch_normalization_211 (Batch (None, 17, 17, 160) 480 ['conv2d_253[0][0]']
Normalization)
activation_54 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_211[0][0]']
conv2d_254 (Conv2D) (None, 17, 17, 160) 179200 ['activation_54[0][0]']
batch_normalization_212 (Batch (None, 17, 17, 160) 480 ['conv2d_254[0][0]']
Normalization)
activation_55 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_212[0][0]']
conv2d_250 (Conv2D) (None, 17, 17, 160) 122880 ['mixed5[0][0]']
conv2d_255 (Conv2D) (None, 17, 17, 160) 179200 ['activation_55[0][0]']
batch_normalization_208 (Batch (None, 17, 17, 160) 480 ['conv2d_250[0][0]']
Normalization)
batch_normalization_213 (Batch (None, 17, 17, 160) 480 ['conv2d_255[0][0]']
Normalization)
activation_51 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_208[0][0]']
activation_56 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_213[0][0]']
conv2d_251 (Conv2D) (None, 17, 17, 160) 179200 ['activation_51[0][0]']
conv2d_256 (Conv2D) (None, 17, 17, 160) 179200 ['activation_56[0][0]']
batch_normalization_209 (Batch (None, 17, 17, 160) 480 ['conv2d_251[0][0]']
Normalization)
batch_normalization_214 (Batch (None, 17, 17, 160) 480 ['conv2d_256[0][0]']
Normalization)
activation_52 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_209[0][0]']
activation_57 (Activation) (None, 17, 17, 160) 0 ['batch_normalization_214[0][0]']
average_pooling2d_12 (AverageP (None, 17, 17, 768) 0 ['mixed5[0][0]']
ooling2D)
conv2d_249 (Conv2D) (None, 17, 17, 192) 147456 ['mixed5[0][0]']
conv2d_252 (Conv2D) (None, 17, 17, 192) 215040 ['activation_52[0][0]']
conv2d_257 (Conv2D) (None, 17, 17, 192) 215040 ['activation_57[0][0]']
conv2d_258 (Conv2D) (None, 17, 17, 192) 147456 ['average_pooling2d_12[0][0]']
batch_normalization_207 (Batch (None, 17, 17, 192) 576 ['conv2d_249[0][0]']
Normalization)
batch_normalization_210 (Batch (None, 17, 17, 192) 576 ['conv2d_252[0][0]']
Normalization)
batch_normalization_215 (Batch (None, 17, 17, 192) 576 ['conv2d_257[0][0]']
Normalization)
batch_normalization_216 (Batch (None, 17, 17, 192) 576 ['conv2d_258[0][0]']
Normalization)
activation_50 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_207[0][0]']
activation_53 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_210[0][0]']
activation_58 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_215[0][0]']
activation_59 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_216[0][0]']
mixed6 (Concatenate) (None, 17, 17, 768) 0 ['activation_50[0][0]',
'activation_53[0][0]',
'activation_58[0][0]',
'activation_59[0][0]']
conv2d_263 (Conv2D) (None, 17, 17, 192) 147456 ['mixed6[0][0]']
batch_normalization_221 (Batch (None, 17, 17, 192) 576 ['conv2d_263[0][0]']
Normalization)
activation_64 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_221[0][0]']
conv2d_264 (Conv2D) (None, 17, 17, 192) 258048 ['activation_64[0][0]']
batch_normalization_222 (Batch (None, 17, 17, 192) 576 ['conv2d_264[0][0]']
Normalization)
activation_65 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_222[0][0]']
conv2d_260 (Conv2D) (None, 17, 17, 192) 147456 ['mixed6[0][0]']
conv2d_265 (Conv2D) (None, 17, 17, 192) 258048 ['activation_65[0][0]']
batch_normalization_218 (Batch (None, 17, 17, 192) 576 ['conv2d_260[0][0]']
Normalization)
batch_normalization_223 (Batch (None, 17, 17, 192) 576 ['conv2d_265[0][0]']
Normalization)
activation_61 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_218[0][0]']
activation_66 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_223[0][0]']
conv2d_261 (Conv2D) (None, 17, 17, 192) 258048 ['activation_61[0][0]']
conv2d_266 (Conv2D) (None, 17, 17, 192) 258048 ['activation_66[0][0]']
batch_normalization_219 (Batch (None, 17, 17, 192) 576 ['conv2d_261[0][0]']
Normalization)
batch_normalization_224 (Batch (None, 17, 17, 192) 576 ['conv2d_266[0][0]']
Normalization)
activation_62 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_219[0][0]']
activation_67 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_224[0][0]']
average_pooling2d_13 (AverageP (None, 17, 17, 768) 0 ['mixed6[0][0]']
ooling2D)
conv2d_259 (Conv2D) (None, 17, 17, 192) 147456 ['mixed6[0][0]']
conv2d_262 (Conv2D) (None, 17, 17, 192) 258048 ['activation_62[0][0]']
conv2d_267 (Conv2D) (None, 17, 17, 192) 258048 ['activation_67[0][0]']
conv2d_268 (Conv2D) (None, 17, 17, 192) 147456 ['average_pooling2d_13[0][0]']
batch_normalization_217 (Batch (None, 17, 17, 192) 576 ['conv2d_259[0][0]']
Normalization)
batch_normalization_220 (Batch (None, 17, 17, 192) 576 ['conv2d_262[0][0]']
Normalization)
batch_normalization_225 (Batch (None, 17, 17, 192) 576 ['conv2d_267[0][0]']
Normalization)
batch_normalization_226 (Batch (None, 17, 17, 192) 576 ['conv2d_268[0][0]']
Normalization)
activation_60 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_217[0][0]']
activation_63 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_220[0][0]']
activation_68 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_225[0][0]']
activation_69 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_226[0][0]']
mixed7 (Concatenate) (None, 17, 17, 768) 0 ['activation_60[0][0]',
'activation_63[0][0]',
'activation_68[0][0]',
'activation_69[0][0]']
conv2d_271 (Conv2D) (None, 17, 17, 192) 147456 ['mixed7[0][0]']
batch_normalization_229 (Batch (None, 17, 17, 192) 576 ['conv2d_271[0][0]']
Normalization)
activation_72 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_229[0][0]']
conv2d_272 (Conv2D) (None, 17, 17, 192) 258048 ['activation_72[0][0]']
batch_normalization_230 (Batch (None, 17, 17, 192) 576 ['conv2d_272[0][0]']
Normalization)
activation_73 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_230[0][0]']
conv2d_269 (Conv2D) (None, 17, 17, 192) 147456 ['mixed7[0][0]']
conv2d_273 (Conv2D) (None, 17, 17, 192) 258048 ['activation_73[0][0]']
batch_normalization_227 (Batch (None, 17, 17, 192) 576 ['conv2d_269[0][0]']
Normalization)
batch_normalization_231 (Batch (None, 17, 17, 192) 576 ['conv2d_273[0][0]']
Normalization)
activation_70 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_227[0][0]']
activation_74 (Activation) (None, 17, 17, 192) 0 ['batch_normalization_231[0][0]']
conv2d_270 (Conv2D) (None, 8, 8, 320) 552960 ['activation_70[0][0]']
conv2d_274 (Conv2D) (None, 8, 8, 192) 331776 ['activation_74[0][0]']
batch_normalization_228 (Batch (None, 8, 8, 320) 960 ['conv2d_270[0][0]']
Normalization)
batch_normalization_232 (Batch (None, 8, 8, 192) 576 ['conv2d_274[0][0]']
Normalization)
activation_71 (Activation) (None, 8, 8, 320) 0 ['batch_normalization_228[0][0]']
activation_75 (Activation) (None, 8, 8, 192) 0 ['batch_normalization_232[0][0]']
max_pooling2d_84 (MaxPooling2D (None, 8, 8, 768) 0 ['mixed7[0][0]']
)
mixed8 (Concatenate) (None, 8, 8, 1280) 0 ['activation_71[0][0]',
'activation_75[0][0]',
'max_pooling2d_84[0][0]']
conv2d_279 (Conv2D) (None, 8, 8, 448) 573440 ['mixed8[0][0]']
batch_normalization_237 (Batch (None, 8, 8, 448) 1344 ['conv2d_279[0][0]']
Normalization)
activation_80 (Activation) (None, 8, 8, 448) 0 ['batch_normalization_237[0][0]']
conv2d_276 (Conv2D) (None, 8, 8, 384) 491520 ['mixed8[0][0]']
conv2d_280 (Conv2D) (None, 8, 8, 384) 1548288 ['activation_80[0][0]']
batch_normalization_234 (Batch (None, 8, 8, 384) 1152 ['conv2d_276[0][0]']
Normalization)
batch_normalization_238 (Batch (None, 8, 8, 384) 1152 ['conv2d_280[0][0]']
Normalization)
activation_77 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_234[0][0]']
activation_81 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_238[0][0]']
conv2d_277 (Conv2D) (None, 8, 8, 384) 442368 ['activation_77[0][0]']
conv2d_278 (Conv2D) (None, 8, 8, 384) 442368 ['activation_77[0][0]']
conv2d_281 (Conv2D) (None, 8, 8, 384) 442368 ['activation_81[0][0]']
conv2d_282 (Conv2D) (None, 8, 8, 384) 442368 ['activation_81[0][0]']
average_pooling2d_14 (AverageP (None, 8, 8, 1280) 0 ['mixed8[0][0]']
ooling2D)
conv2d_275 (Conv2D) (None, 8, 8, 320) 409600 ['mixed8[0][0]']
batch_normalization_235 (Batch (None, 8, 8, 384) 1152 ['conv2d_277[0][0]']
Normalization)
batch_normalization_236 (Batch (None, 8, 8, 384) 1152 ['conv2d_278[0][0]']
Normalization)
batch_normalization_239 (Batch (None, 8, 8, 384) 1152 ['conv2d_281[0][0]']
Normalization)
batch_normalization_240 (Batch (None, 8, 8, 384) 1152 ['conv2d_282[0][0]']
Normalization)
conv2d_283 (Conv2D) (None, 8, 8, 192) 245760 ['average_pooling2d_14[0][0]']
batch_normalization_233 (Batch (None, 8, 8, 320) 960 ['conv2d_275[0][0]']
Normalization)
activation_78 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_235[0][0]']
activation_79 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_236[0][0]']
activation_82 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_239[0][0]']
activation_83 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_240[0][0]']
batch_normalization_241 (Batch (None, 8, 8, 192) 576 ['conv2d_283[0][0]']
Normalization)
activation_76 (Activation) (None, 8, 8, 320) 0 ['batch_normalization_233[0][0]']
mixed9_0 (Concatenate) (None, 8, 8, 768) 0 ['activation_78[0][0]',
'activation_79[0][0]']
concatenate (Concatenate) (None, 8, 8, 768) 0 ['activation_82[0][0]',
'activation_83[0][0]']
activation_84 (Activation) (None, 8, 8, 192) 0 ['batch_normalization_241[0][0]']
mixed9 (Concatenate) (None, 8, 8, 2048) 0 ['activation_76[0][0]',
'mixed9_0[0][0]',
'concatenate[0][0]',
'activation_84[0][0]']
conv2d_288 (Conv2D) (None, 8, 8, 448) 917504 ['mixed9[0][0]']
batch_normalization_246 (Batch (None, 8, 8, 448) 1344 ['conv2d_288[0][0]']
Normalization)
activation_89 (Activation) (None, 8, 8, 448) 0 ['batch_normalization_246[0][0]']
conv2d_285 (Conv2D) (None, 8, 8, 384) 786432 ['mixed9[0][0]']
conv2d_289 (Conv2D) (None, 8, 8, 384) 1548288 ['activation_89[0][0]']
batch_normalization_243 (Batch (None, 8, 8, 384) 1152 ['conv2d_285[0][0]']
Normalization)
batch_normalization_247 (Batch (None, 8, 8, 384) 1152 ['conv2d_289[0][0]']
Normalization)
activation_86 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_243[0][0]']
activation_90 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_247[0][0]']
conv2d_286 (Conv2D) (None, 8, 8, 384) 442368 ['activation_86[0][0]']
conv2d_287 (Conv2D) (None, 8, 8, 384) 442368 ['activation_86[0][0]']
conv2d_290 (Conv2D) (None, 8, 8, 384) 442368 ['activation_90[0][0]']
conv2d_291 (Conv2D) (None, 8, 8, 384) 442368 ['activation_90[0][0]']
average_pooling2d_15 (AverageP (None, 8, 8, 2048) 0 ['mixed9[0][0]']
ooling2D)
conv2d_284 (Conv2D) (None, 8, 8, 320) 655360 ['mixed9[0][0]']
batch_normalization_244 (Batch (None, 8, 8, 384) 1152 ['conv2d_286[0][0]']
Normalization)
batch_normalization_245 (Batch (None, 8, 8, 384) 1152 ['conv2d_287[0][0]']
Normalization)
batch_normalization_248 (Batch (None, 8, 8, 384) 1152 ['conv2d_290[0][0]']
Normalization)
batch_normalization_249 (Batch (None, 8, 8, 384) 1152 ['conv2d_291[0][0]']
Normalization)
conv2d_292 (Conv2D) (None, 8, 8, 192) 393216 ['average_pooling2d_15[0][0]']
batch_normalization_242 (Batch (None, 8, 8, 320) 960 ['conv2d_284[0][0]']
Normalization)
activation_87 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_244[0][0]']
activation_88 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_245[0][0]']
activation_91 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_248[0][0]']
activation_92 (Activation) (None, 8, 8, 384) 0 ['batch_normalization_249[0][0]']
batch_normalization_250 (Batch (None, 8, 8, 192) 576 ['conv2d_292[0][0]']
Normalization)
activation_85 (Activation) (None, 8, 8, 320) 0 ['batch_normalization_242[0][0]']
mixed9_1 (Concatenate) (None, 8, 8, 768) 0 ['activation_87[0][0]',
'activation_88[0][0]']
concatenate_1 (Concatenate) (None, 8, 8, 768) 0 ['activation_91[0][0]',
'activation_92[0][0]']
activation_93 (Activation) (None, 8, 8, 192) 0 ['batch_normalization_250[0][0]']
mixed10 (Concatenate) (None, 8, 8, 2048) 0 ['activation_85[0][0]',
'mixed9_1[0][0]',
'concatenate_1[0][0]',
'activation_93[0][0]']
avg_pool (GlobalAveragePooling (None, 2048) 0 ['mixed10[0][0]']
2D)
predictions (Dense) (None, 1000) 2049000 ['avg_pool[0][0]']
==================================================================================================
Total params: 23,851,784
Trainable params: 23,817,352
Non-trainable params: 34,432
__________________________________________________________________________________________________
The above summary tells us about the format of InceptionNet v3, showing how it uses BatchNormalization and Activation layers after each convolutional layer. We shall take note of this, and apply it later on. In addition, we shall also simplify the InceptionNet model, given that it was optimized for 299 x 299 images, and not the 128 x 128 and 31 x 31 images of our dataset.
def vgg_block(n_conv, n_filter, pooling = True):
block = Sequential()
for _ in range (n_conv):
block.add(Conv2D(n_filter, kernel_size=3, padding='same', activation='relu'))
block.add(BatchNormalization())
block.add(ReLU())
if pooling:
block.add(MaxPooling2D(pool_size=2, strides=2))
return block
Here, we make a simpler model based on the VGG architecture. We make use of 10 Convolutional Layers, and 3 Fully Connected layers, making it a VGG-13 model. In addition, instead of using a 1000 channel softmax layer, we use a 15 channel softmax layer to match the output of our dataset.
res31_vggmodel = Sequential()
res31_vggmodel.add(Input(shape = (31, 31, 1)))
res31_vggmodel.add(vgg_block(2, 32))
res31_vggmodel.add(vgg_block(2, 64))
res31_vggmodel.add(vgg_block(3, 128))
res31_vggmodel.add(vgg_block(3, 256))
res31_vggmodel.add(Flatten())
res31_vggmodel.add(Dense(4096, activation = "relu"))
res31_vggmodel.add(Dense(4096, activation = "relu"))
res31_vggmodel.add(ActivityRegularization(l1 = 0.0001, l2 = 0.0001))
res31_vggmodel.add(Dense(num_classes, activation = "softmax"))
res31_vggmodel.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = res31_vggmodel.fit(std_res31_train, validation_data = std_res31_validation, epochs = 150, callbacks = earlyStopper)
cnnf.model_metric_graph(history, res31_vggmodel, std_res31_validation, name = "adapted VGG-13", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 9s 25ms/step - loss: 2.4111 - accuracy: 0.2172 - val_loss: 2.9013 - val_accuracy: 0.0813 Epoch 2/150 283/283 [==============================] - 7s 25ms/step - loss: 1.7566 - accuracy: 0.4350 - val_loss: 1.9467 - val_accuracy: 0.3463 Epoch 3/150 283/283 [==============================] - 7s 24ms/step - loss: 1.3811 - accuracy: 0.5624 - val_loss: 1.5058 - val_accuracy: 0.5510 Epoch 4/150 283/283 [==============================] - 8s 26ms/step - loss: 1.0904 - accuracy: 0.6615 - val_loss: 1.4079 - val_accuracy: 0.5523 Epoch 5/150 283/283 [==============================] - 7s 25ms/step - loss: 0.8906 - accuracy: 0.7370 - val_loss: 1.1108 - val_accuracy: 0.6773 Epoch 6/150 283/283 [==============================] - 7s 24ms/step - loss: 0.7268 - accuracy: 0.7836 - val_loss: 1.1410 - val_accuracy: 0.6817 Epoch 7/150 283/283 [==============================] - 8s 25ms/step - loss: 0.5670 - accuracy: 0.8365 - val_loss: 0.7813 - val_accuracy: 0.7803 Epoch 8/150 283/283 [==============================] - 7s 25ms/step - loss: 0.4718 - accuracy: 0.8658 - val_loss: 0.7053 - val_accuracy: 0.8083 Epoch 9/150 283/283 [==============================] - 7s 25ms/step - loss: 0.3791 - accuracy: 0.8916 - val_loss: 1.0165 - val_accuracy: 0.7303 Epoch 10/150 283/283 [==============================] - 7s 24ms/step - loss: 0.3587 - accuracy: 0.8998 - val_loss: 0.6048 - val_accuracy: 0.8360 Epoch 11/150 283/283 [==============================] - 7s 24ms/step - loss: 0.3008 - accuracy: 0.9208 - val_loss: 0.6989 - val_accuracy: 0.8320 Epoch 12/150 283/283 [==============================] - 7s 25ms/step - loss: 0.2578 - accuracy: 0.9302 - val_loss: 0.7067 - val_accuracy: 0.8177 Epoch 13/150 283/283 [==============================] - 7s 24ms/step - loss: 0.2345 - accuracy: 0.9343 - val_loss: 0.6134 - val_accuracy: 0.8400 Epoch 14/150 283/283 [==============================] - 7s 24ms/step - loss: 0.1990 - accuracy: 0.9467 - val_loss: 0.8285 - val_accuracy: 0.8103 Epoch 15/150 283/283 [==============================] - 7s 24ms/step - loss: 0.1759 - accuracy: 0.9548 - val_loss: 0.5408 - val_accuracy: 0.8577 Epoch 16/150 283/283 [==============================] - 9s 29ms/step - loss: 0.1414 - accuracy: 0.9620 - val_loss: 0.5555 - val_accuracy: 0.8747 Epoch 17/150 283/283 [==============================] - 8s 28ms/step - loss: 0.1526 - accuracy: 0.9597 - val_loss: 0.5036 - val_accuracy: 0.8693 Epoch 18/150 283/283 [==============================] - 8s 27ms/step - loss: 0.1705 - accuracy: 0.9554 - val_loss: 0.4093 - val_accuracy: 0.8953 Epoch 19/150 283/283 [==============================] - 8s 25ms/step - loss: 0.1204 - accuracy: 0.9704 - val_loss: 0.3586 - val_accuracy: 0.9217 Epoch 20/150 283/283 [==============================] - 8s 26ms/step - loss: 0.1165 - accuracy: 0.9684 - val_loss: 0.5269 - val_accuracy: 0.8737 Epoch 21/150 283/283 [==============================] - 8s 26ms/step - loss: 0.1183 - accuracy: 0.9710 - val_loss: 0.5190 - val_accuracy: 0.8837 Epoch 22/150 283/283 [==============================] - 8s 29ms/step - loss: 0.1477 - accuracy: 0.9626 - val_loss: 0.4237 - val_accuracy: 0.8927 Epoch 23/150 283/283 [==============================] - 8s 26ms/step - loss: 0.1293 - accuracy: 0.9671 - val_loss: 0.4876 - val_accuracy: 0.8903 Epoch 24/150 283/283 [==============================] - 8s 27ms/step - loss: 0.1560 - accuracy: 0.9592 - val_loss: 0.4937 - val_accuracy: 0.8860 Epoch 25/150 283/283 [==============================] - 8s 28ms/step - loss: 0.1336 - accuracy: 0.9668 - val_loss: 0.4204 - val_accuracy: 0.8923 Epoch 26/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0763 - accuracy: 0.9818 - val_loss: 0.3732 - val_accuracy: 0.9110 Epoch 27/150 283/283 [==============================] - 8s 26ms/step - loss: 0.0749 - accuracy: 0.9825 - val_loss: 0.4512 - val_accuracy: 0.8963 Epoch 28/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0717 - accuracy: 0.9828 - val_loss: 0.4216 - val_accuracy: 0.9067 Epoch 29/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0731 - accuracy: 0.9811 - val_loss: 0.4964 - val_accuracy: 0.8863 Epoch 30/150 283/283 [==============================] - 7s 25ms/step - loss: 0.1283 - accuracy: 0.9668 - val_loss: 0.5020 - val_accuracy: 0.8883 Epoch 31/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0681 - accuracy: 0.9838 - val_loss: 0.4358 - val_accuracy: 0.9120 Epoch 32/150 283/283 [==============================] - 7s 24ms/step - loss: 0.1151 - accuracy: 0.9711 - val_loss: 0.5898 - val_accuracy: 0.8790 Epoch 33/150 283/283 [==============================] - 8s 26ms/step - loss: 0.0729 - accuracy: 0.9829 - val_loss: 0.4331 - val_accuracy: 0.9047 Epoch 34/150 283/283 [==============================] - 8s 26ms/step - loss: 0.0736 - accuracy: 0.9824 - val_loss: 0.4451 - val_accuracy: 0.9130 Epoch 35/150 283/283 [==============================] - 7s 24ms/step - loss: 0.0541 - accuracy: 0.9875 - val_loss: 0.5297 - val_accuracy: 0.8847 Epoch 36/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0573 - accuracy: 0.9865 - val_loss: 0.6814 - val_accuracy: 0.8583 Epoch 37/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0711 - accuracy: 0.9855 - val_loss: 0.3943 - val_accuracy: 0.9147 Epoch 38/150 283/283 [==============================] - 7s 24ms/step - loss: 0.0511 - accuracy: 0.9888 - val_loss: 0.3811 - val_accuracy: 0.9183 Epoch 39/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0931 - accuracy: 0.9763 - val_loss: 0.4722 - val_accuracy: 0.8983 Epoch 40/150 283/283 [==============================] - 8s 29ms/step - loss: 0.0804 - accuracy: 0.9804 - val_loss: 0.4319 - val_accuracy: 0.9040 Epoch 41/150 283/283 [==============================] - 8s 27ms/step - loss: 0.0623 - accuracy: 0.9862 - val_loss: 0.4914 - val_accuracy: 0.8903 Epoch 42/150 283/283 [==============================] - 8s 27ms/step - loss: 0.0852 - accuracy: 0.9802 - val_loss: 0.4446 - val_accuracy: 0.9070 Epoch 43/150 283/283 [==============================] - 8s 27ms/step - loss: 0.0448 - accuracy: 0.9906 - val_loss: 0.4817 - val_accuracy: 0.9033 Epoch 44/150 283/283 [==============================] - 7s 25ms/step - loss: 0.0564 - accuracy: 0.9872 - val_loss: 0.4102 - val_accuracy: 0.9153 Epoch 45/150 283/283 [==============================] - 8s 26ms/step - loss: 0.1262 - accuracy: 0.9679 - val_loss: 0.4463 - val_accuracy: 0.8913 Epoch 46/150 283/283 [==============================] - 8s 26ms/step - loss: 0.0392 - accuracy: 0.9898 - val_loss: 0.3822 - val_accuracy: 0.9283 Epoch 47/150 283/283 [==============================] - 8s 26ms/step - loss: 0.0274 - accuracy: 0.9946 - val_loss: 0.5049 - val_accuracy: 0.9100 Epoch 48/150 283/283 [==============================] - 7s 24ms/step - loss: 0.0448 - accuracy: 0.9900 - val_loss: 0.4839 - val_accuracy: 0.8983 Epoch 49/150 283/283 [==============================] - 7s 24ms/step - loss: 0.0579 - accuracy: 0.9860 - val_loss: 0.4841 - val_accuracy: 0.8953
res31_vggmodel.summary()
Model: "sequential_90"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
sequential_91 (Sequential) (None, 15, 15, 32) 9824
sequential_92 (Sequential) (None, 7, 7, 64) 55936
sequential_93 (Sequential) (None, 3, 3, 128) 370560
sequential_94 (Sequential) (None, 1, 1, 256) 1478400
flatten_19 (Flatten) (None, 256) 0
dense_55 (Dense) (None, 4096) 1052672
dense_56 (Dense) (None, 4096) 16781312
activity_regularization_19 (None, 4096) 0
(ActivityRegularization)
dense_57 (Dense) (None, 15) 61455
=================================================================
Total params: 19,810,159
Trainable params: 19,807,471
Non-trainable params: 2,688
_________________________________________________________________
tf.keras.utils.plot_model(res31_vggmodel, show_shapes=True, dpi=80)
From the above, we can see that the VGG architecture performs quite well on our dataset. Now we shall try to create a InceptionNet model, and define the Inception Modules as well.
Defining "Inception Modules" for use later.
"inceptionBlock1" follows this architecture:
class inceptionBlock1(Layer):
def __init__(self, num_filters, **kwargs):
super(inceptionBlock1, self).__init__(**kwargs)
self.num_filters = num_filters
def build(self, input_shape):
self.tower_1_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.tower_1_conv2 = Conv2D(self.num_filters, (3, 3), padding='same', activation='relu')
self.tower_1_conv3 = Conv2D(self.num_filters, (3, 3), padding='same', activation='relu')
self.tower_2_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.tower_2_conv2 = Conv2D(self.num_filters, (3, 3), padding='same', activation='relu')
self.tower_3_pool = AveragePooling2D((3, 3), strides=(1, 1), padding='same')
self.tower_3_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.tower_4_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
def call(self, inputs):
tower_1 = self.tower_1_conv1(inputs)
tower_1 = self.tower_1_conv2(tower_1)
tower_1 = self.tower_1_conv3(tower_1)
tower_2 = self.tower_2_conv1(inputs)
tower_2 = self.tower_2_conv2(tower_2)
tower_3 = self.tower_3_pool(inputs)
tower_3 = self.tower_3_conv1(tower_3)
tower_4 = self.tower_4_conv1(inputs)
output = concatenate([tower_1, tower_2, tower_3, tower_4], axis=3)
return output
"inceptionBlock2" follows this architecture:
note: for our use, n shall be set at 3, representing a 3 x 3 factorized convolution.
class inceptionBlock2(Layer):
def __init__(self, num_filters, n, **kwargs):
super(inceptionBlock2, self).__init__(**kwargs)
self.num_filters = num_filters
self.n = n
def build(self, input_shape):
self.tower_1_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.tower_1_conv2 = Conv2D(self.num_filters, (1, self.n), padding='same', activation='relu')
self.tower_1_conv3 = Conv2D(self.num_filters, (self.n, 1), padding='same', activation='relu')
self.tower_1_conv4 = Conv2D(self.num_filters, (1, self.n), padding='same', activation='relu')
self.tower_1_conv5 = Conv2D(self.num_filters, (self.n, 1), padding='same', activation='relu')
self.tower_2_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.tower_2_conv2 = Conv2D(self.num_filters, (1, self.n), padding='same', activation='relu')
self.tower_2_conv3 = Conv2D(self.num_filters, (self.n, 1), padding='same', activation='relu')
self.tower_3_pool = AveragePooling2D((3, 3), strides=(1, 1), padding='same')
self.tower_3_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.tower_4_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
def call(self, inputs):
tower_1 = self.tower_1_conv1(inputs)
tower_1 = self.tower_1_conv2(tower_1)
tower_1 = self.tower_1_conv3(tower_1)
tower_1 = self.tower_1_conv4(tower_1)
tower_1 = self.tower_1_conv5(tower_1)
tower_2 = self.tower_2_conv1(inputs)
tower_2 = self.tower_2_conv2(tower_2)
tower_2 = self.tower_2_conv3(tower_2)
tower_3 = self.tower_3_pool(inputs)
tower_3 = self.tower_3_conv1(tower_3)
tower_4 = self.tower_4_conv1(inputs)
output = concatenate([tower_1, tower_2, tower_3, tower_4], axis=3)
return output
"inceptionBlock3" follows this architecture:
class inceptionBlock3(Layer):
def __init__(self, num_filters, **kwargs):
super(inceptionBlock3, self).__init__(**kwargs)
self.num_filters = num_filters
def build(self, input_shape):
self.branch1_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.branch2_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.branch2_conv2 = Conv2D(self.num_filters, (1, 3), padding='same', activation='relu')
self.branch2_conv3 = Conv2D(self.num_filters, (3, 1), padding='same', activation='relu')
self.branch3_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
self.branch3_conv2 = Conv2D(self.num_filters, (3, 3), padding='same', activation='relu')
self.branch3_conv3 = Conv2D(self.num_filters, (1, 3), padding='same', activation='relu')
self.branch3_conv4 = Conv2D(self.num_filters, (3, 1), padding='same', activation='relu')
self.branch4_pool = AveragePooling2D((3, 3), strides=(1, 1), padding='same')
self.branch4_conv1 = Conv2D(self.num_filters, (1, 1), padding='same', activation='relu')
def call(self, inputs):
branch1 = self.branch1_conv1(inputs)
branch2 = self.branch2_conv1(inputs)
branch2_1x3 = self.branch2_conv2(branch2)
branch2_3x1 = self.branch2_conv3(branch2)
branch3 = self.branch3_conv1(inputs)
branch3 = self.branch3_conv2(branch3)
branch3_1x3 = self.branch3_conv3(branch3)
branch3_3x1 = self.branch3_conv4(branch3)
branch4 = self.branch4_pool(inputs)
branch4 = self.branch4_conv1(branch4)
output = concatenate([branch1, branch2_1x3, branch2_3x1, branch3_1x3, branch3_3x1, branch4], axis=-1)
return output
Here, we define a simpler version of the InceptionNet v3 model. Instead of using a inception block layout of 3-5-2, we shall use a simple layour of 2-3-1. We also remove two more convolutional layers to simplify it further, but add a regularization layer to stabilize the model.
res31_inceptionmodel = Sequential()
res31_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = (2,1), padding = "same", activation = "relu"))
res31_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = (1,1), padding = "valid", activation = "relu"))
res31_inceptionmodel.add(MaxPooling2D((2,2)))
res31_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = (1,1), padding = "same", activation = "relu"))
res31_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = (1,1), padding = "same", activation = "relu"))
res31_inceptionmodel.add(inceptionBlock1(64))
res31_inceptionmodel.add(inceptionBlock1(64))
res31_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res31_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res31_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res31_inceptionmodel.add(inceptionBlock3(num_filters = 64))
res31_inceptionmodel.add(AveragePooling2D())
res31_inceptionmodel.add(Dropout(0.2))
res31_inceptionmodel.add(Dense(2048, activation = "relu"))
res31_inceptionmodel.add(ActivityRegularization(l1 = 0.0005, l2 = 0.0002))
res31_inceptionmodel.add(Flatten()) #added to allow model to work
res31_inceptionmodel.add(Dense(num_classes, activation = "softmax"))
res31_inceptionmodel.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = res31_inceptionmodel.fit(std_res31_train, validation_data = std_res31_validation, epochs = 150, callbacks = earlyStopper)
cnnf.model_metric_graph(history, res31_inceptionmodel, std_res31_validation, name = "adapted InceptionNet v3", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 20s 49ms/step - loss: 2.5695 - accuracy: 0.1500 - val_loss: 2.5545 - val_accuracy: 0.1227 Epoch 2/150 283/283 [==============================] - 13s 46ms/step - loss: 2.1079 - accuracy: 0.3210 - val_loss: 2.1597 - val_accuracy: 0.3043 Epoch 3/150 283/283 [==============================] - 13s 47ms/step - loss: 1.7902 - accuracy: 0.4403 - val_loss: 3.3991 - val_accuracy: 0.2017 Epoch 4/150 283/283 [==============================] - 14s 48ms/step - loss: 1.5317 - accuracy: 0.5246 - val_loss: 2.6475 - val_accuracy: 0.3013 Epoch 5/150 283/283 [==============================] - 14s 48ms/step - loss: 1.3162 - accuracy: 0.5961 - val_loss: 1.2645 - val_accuracy: 0.6080 Epoch 6/150 283/283 [==============================] - 13s 46ms/step - loss: 1.1083 - accuracy: 0.6577 - val_loss: 1.2395 - val_accuracy: 0.6470 Epoch 7/150 283/283 [==============================] - 13s 46ms/step - loss: 0.9549 - accuracy: 0.7119 - val_loss: 2.3906 - val_accuracy: 0.5113 Epoch 8/150 283/283 [==============================] - 13s 46ms/step - loss: 0.8167 - accuracy: 0.7622 - val_loss: 1.2736 - val_accuracy: 0.6290 Epoch 9/150 283/283 [==============================] - 14s 48ms/step - loss: 0.6860 - accuracy: 0.7987 - val_loss: 2.2882 - val_accuracy: 0.5150 Epoch 10/150 283/283 [==============================] - 14s 48ms/step - loss: 0.5921 - accuracy: 0.8292 - val_loss: 0.9844 - val_accuracy: 0.7477 Epoch 11/150 283/283 [==============================] - 13s 46ms/step - loss: 0.5078 - accuracy: 0.8571 - val_loss: 1.0955 - val_accuracy: 0.6730 Epoch 12/150 283/283 [==============================] - 13s 45ms/step - loss: 0.4403 - accuracy: 0.8768 - val_loss: 0.9864 - val_accuracy: 0.7353 Epoch 13/150 283/283 [==============================] - 14s 48ms/step - loss: 0.3940 - accuracy: 0.8949 - val_loss: 1.4728 - val_accuracy: 0.6500 Epoch 14/150 283/283 [==============================] - 14s 48ms/step - loss: 0.3401 - accuracy: 0.9122 - val_loss: 0.9465 - val_accuracy: 0.7473 Epoch 15/150 283/283 [==============================] - 14s 47ms/step - loss: 0.2990 - accuracy: 0.9243 - val_loss: 1.2581 - val_accuracy: 0.7013 Epoch 16/150 283/283 [==============================] - 13s 46ms/step - loss: 0.2790 - accuracy: 0.9279 - val_loss: 0.7852 - val_accuracy: 0.8050 Epoch 17/150 283/283 [==============================] - 13s 46ms/step - loss: 0.2454 - accuracy: 0.9406 - val_loss: 0.7924 - val_accuracy: 0.8127 Epoch 18/150 283/283 [==============================] - 14s 48ms/step - loss: 0.2421 - accuracy: 0.9427 - val_loss: 0.7448 - val_accuracy: 0.8257 Epoch 19/150 283/283 [==============================] - 14s 48ms/step - loss: 0.2201 - accuracy: 0.9489 - val_loss: 1.1154 - val_accuracy: 0.7400 Epoch 20/150 283/283 [==============================] - 14s 48ms/step - loss: 0.2030 - accuracy: 0.9555 - val_loss: 0.7546 - val_accuracy: 0.8320 Epoch 21/150 283/283 [==============================] - 13s 47ms/step - loss: 0.1911 - accuracy: 0.9580 - val_loss: 0.8820 - val_accuracy: 0.7953 Epoch 22/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1837 - accuracy: 0.9579 - val_loss: 1.5478 - val_accuracy: 0.6967 Epoch 23/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1799 - accuracy: 0.9581 - val_loss: 0.7243 - val_accuracy: 0.8307 Epoch 24/150 283/283 [==============================] - 14s 48ms/step - loss: 0.1578 - accuracy: 0.9653 - val_loss: 0.7525 - val_accuracy: 0.8267 Epoch 25/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1518 - accuracy: 0.9687 - val_loss: 0.9391 - val_accuracy: 0.8047 Epoch 26/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1516 - accuracy: 0.9675 - val_loss: 0.8988 - val_accuracy: 0.7977 Epoch 27/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1376 - accuracy: 0.9713 - val_loss: 0.6745 - val_accuracy: 0.8440 Epoch 28/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1327 - accuracy: 0.9715 - val_loss: 0.7117 - val_accuracy: 0.8397 Epoch 29/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1310 - accuracy: 0.9713 - val_loss: 0.7146 - val_accuracy: 0.8437 Epoch 30/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1267 - accuracy: 0.9735 - val_loss: 0.7260 - val_accuracy: 0.8400 Epoch 31/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1163 - accuracy: 0.9780 - val_loss: 0.8078 - val_accuracy: 0.8183 Epoch 32/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1270 - accuracy: 0.9747 - val_loss: 1.0474 - val_accuracy: 0.8053 Epoch 33/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1192 - accuracy: 0.9743 - val_loss: 0.6132 - val_accuracy: 0.8587 Epoch 34/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1082 - accuracy: 0.9783 - val_loss: 0.8247 - val_accuracy: 0.8193 Epoch 35/150 283/283 [==============================] - 14s 48ms/step - loss: 0.1112 - accuracy: 0.9778 - val_loss: 0.8026 - val_accuracy: 0.8257 Epoch 36/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1132 - accuracy: 0.9772 - val_loss: 0.6939 - val_accuracy: 0.8470 Epoch 37/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1097 - accuracy: 0.9781 - val_loss: 0.6901 - val_accuracy: 0.8417 Epoch 38/150 283/283 [==============================] - 14s 48ms/step - loss: 0.1064 - accuracy: 0.9791 - val_loss: 0.7908 - val_accuracy: 0.8353 Epoch 39/150 283/283 [==============================] - 13s 46ms/step - loss: 0.0979 - accuracy: 0.9804 - val_loss: 0.6912 - val_accuracy: 0.8513 Epoch 40/150 283/283 [==============================] - 13s 46ms/step - loss: 0.1076 - accuracy: 0.9790 - val_loss: 0.7810 - val_accuracy: 0.8340 Epoch 41/150 283/283 [==============================] - 14s 47ms/step - loss: 0.1018 - accuracy: 0.9786 - val_loss: 0.7506 - val_accuracy: 0.8447 Epoch 42/150 283/283 [==============================] - 14s 48ms/step - loss: 0.0938 - accuracy: 0.9817 - val_loss: 0.7034 - val_accuracy: 0.8517 Epoch 43/150 283/283 [==============================] - 14s 48ms/step - loss: 0.0985 - accuracy: 0.9801 - val_loss: 0.7050 - val_accuracy: 0.8477 Epoch 44/150 283/283 [==============================] - 13s 46ms/step - loss: 0.0970 - accuracy: 0.9797 - val_loss: 0.6531 - val_accuracy: 0.8567 Epoch 45/150 283/283 [==============================] - 13s 47ms/step - loss: 0.0978 - accuracy: 0.9774 - val_loss: 0.7542 - val_accuracy: 0.8450 Epoch 46/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0914 - accuracy: 0.9829 - val_loss: 1.1763 - val_accuracy: 0.7930 Epoch 47/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0888 - accuracy: 0.9823 - val_loss: 0.8132 - val_accuracy: 0.8477 Epoch 48/150 283/283 [==============================] - 14s 48ms/step - loss: 0.0952 - accuracy: 0.9815 - val_loss: 1.5317 - val_accuracy: 0.7497 Epoch 49/150 283/283 [==============================] - 13s 47ms/step - loss: 0.0927 - accuracy: 0.9808 - val_loss: 0.7616 - val_accuracy: 0.8283 Epoch 50/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0840 - accuracy: 0.9823 - val_loss: 0.7469 - val_accuracy: 0.8513 Epoch 51/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0951 - accuracy: 0.9812 - val_loss: 0.8766 - val_accuracy: 0.8267 Epoch 52/150 283/283 [==============================] - 14s 48ms/step - loss: 0.0906 - accuracy: 0.9806 - val_loss: 0.6838 - val_accuracy: 0.8580 Epoch 53/150 283/283 [==============================] - 13s 47ms/step - loss: 0.0909 - accuracy: 0.9806 - val_loss: 0.7922 - val_accuracy: 0.8383 Epoch 54/150 283/283 [==============================] - 13s 46ms/step - loss: 0.0920 - accuracy: 0.9812 - val_loss: 0.7282 - val_accuracy: 0.8497 Epoch 55/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0856 - accuracy: 0.9842 - val_loss: 1.9478 - val_accuracy: 0.6517 Epoch 56/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0960 - accuracy: 0.9811 - val_loss: 0.7762 - val_accuracy: 0.8300 Epoch 57/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0867 - accuracy: 0.9822 - val_loss: 1.0144 - val_accuracy: 0.8083 Epoch 58/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0893 - accuracy: 0.9843 - val_loss: 0.6928 - val_accuracy: 0.8500 Epoch 59/150 283/283 [==============================] - 13s 46ms/step - loss: 0.0795 - accuracy: 0.9833 - val_loss: 0.7947 - val_accuracy: 0.8407 Epoch 60/150 283/283 [==============================] - 14s 47ms/step - loss: 0.0825 - accuracy: 0.9842 - val_loss: 0.6766 - val_accuracy: 0.8613 Epoch 61/150 283/283 [==============================] - 14s 48ms/step - loss: 0.0918 - accuracy: 0.9818 - val_loss: 0.7583 - val_accuracy: 0.8503 Epoch 62/150 283/283 [==============================] - 13s 45ms/step - loss: 0.0834 - accuracy: 0.9813 - val_loss: 0.7093 - val_accuracy: 0.8480 Epoch 63/150 283/283 [==============================] - 13s 45ms/step - loss: 0.0767 - accuracy: 0.9854 - val_loss: 0.7206 - val_accuracy: 0.8510
res31_inceptionmodel.summary()
Model: "sequential_95"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d_303 (Conv2D) (None, 16, 31, 64) 640
conv2d_304 (Conv2D) (None, 14, 29, 64) 36928
max_pooling2d_89 (MaxPoolin (None, 7, 14, 64) 0
g2D)
conv2d_305 (Conv2D) (None, 7, 14, 64) 36928
conv2d_306 (Conv2D) (None, 7, 14, 64) 36928
inception_block1_20 (incept (None, 7, 14, 256) 127424
ionBlock1)
inception_block1_21 (incept (None, 7, 14, 256) 176576
ionBlock1)
inception_block2_33 (incept (None, 7, 14, 256) 139904
ionBlock2)
inception_block2_34 (incept (None, 7, 14, 256) 139904
ionBlock2)
inception_block2_35 (incept (None, 7, 14, 256) 139904
ionBlock2)
inception_block3_13 (incept (None, 7, 14, 384) 152128
ionBlock3)
average_pooling2d_16 (Avera (None, 3, 7, 384) 0
gePooling2D)
dropout_7 (Dropout) (None, 3, 7, 384) 0
dense_58 (Dense) (None, 3, 7, 2048) 788480
activity_regularization_20 (None, 3, 7, 2048) 0
(ActivityRegularization)
flatten_20 (Flatten) (None, 43008) 0
dense_59 (Dense) (None, 15) 645135
=================================================================
Total params: 2,420,879
Trainable params: 2,420,879
Non-trainable params: 0
_________________________________________________________________
tf.keras.utils.plot_model(res31_inceptionmodel, show_shapes=True, dpi=80)
finalres31_model_list = [res31_vggmodel, res31_inceptionmodel, baseline31luminosity_model]
finalres31_data_list = [std_res31_validation, std_res31_validation, res31luminosity_validation_data]
name_list = ["Modified VGG-13", "Modified Inception v3", "Baseline CNN"]
cnnf.model_comparison(finalres31_model_list, finalres31_data_list, name_list)
94/94 [==============================] - 1s 11ms/step - loss: 0.4841 - accuracy: 0.8953 94/94 [==============================] - 2s 14ms/step - loss: 0.7206 - accuracy: 0.8510 94/94 [==============================] - 1s 9ms/step - loss: 1.2177 - accuracy: 0.6267
From the above, we can see that the VGG-13 model performs the best, better than the baseline CNN model we used, and much better than the InceptionNet v3 model that we attempted to use. Hence, from this, we shall tune the VGG-13 model, and improve it for the final model.
Here, we attempt to make a model for our data, based off of the VGG architecture. For this model, we make use of 13 convolutional layers, and 3 Fully Connected (Dense) layers, making it a VGG-16 model. In addition, instead of using a 1000 channel softmax layer, we use a 15 channel softmax layer to match the output of our dataset.
res128_vggmodel = Sequential()
res128_vggmodel.add(Input(shape = (128, 128, 1)))
res128_vggmodel.add(vgg_block(2, 32))
res128_vggmodel.add(vgg_block(2, 64))
res128_vggmodel.add(vgg_block(3, 128))
res128_vggmodel.add(vgg_block(3, 256))
res128_vggmodel.add(vgg_block(3, 256))
res128_vggmodel.add(Dense(4096, activation = "relu"))
res128_vggmodel.add(Dense(4096, activation = "relu"))
res128_vggmodel.add(ActivityRegularization(l1 = 0.00001, l2 = 0.00001))
res128_vggmodel.add(Flatten())
res128_vggmodel.add(Dense(num_classes, activation = "softmax"))
res128_vggmodel.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = res128_vggmodel.fit(std_res128_train, validation_data = std_res128_validation, epochs = 150, callbacks = earlyStopper)
cnnf.model_metric_graph(history, res128_vggmodel, std_res128_validation, name = "adapted VGG-16", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 31s 104ms/step - loss: 2.3772 - accuracy: 0.2537 - val_loss: 3.4935 - val_accuracy: 0.0663 Epoch 2/150 283/283 [==============================] - 29s 102ms/step - loss: 1.5532 - accuracy: 0.4900 - val_loss: 2.0382 - val_accuracy: 0.3340 Epoch 3/150 283/283 [==============================] - 29s 102ms/step - loss: 1.2367 - accuracy: 0.6079 - val_loss: 2.6715 - val_accuracy: 0.3583 Epoch 4/150 283/283 [==============================] - 29s 102ms/step - loss: 0.9627 - accuracy: 0.6995 - val_loss: 1.1481 - val_accuracy: 0.6597 Epoch 5/150 283/283 [==============================] - 29s 102ms/step - loss: 0.7828 - accuracy: 0.7605 - val_loss: 2.3542 - val_accuracy: 0.5103 Epoch 6/150 283/283 [==============================] - 29s 102ms/step - loss: 0.6334 - accuracy: 0.8022 - val_loss: 1.2860 - val_accuracy: 0.6233 Epoch 7/150 283/283 [==============================] - 29s 102ms/step - loss: 0.5528 - accuracy: 0.8295 - val_loss: 0.9168 - val_accuracy: 0.7477 Epoch 8/150 283/283 [==============================] - 29s 102ms/step - loss: 0.4464 - accuracy: 0.8623 - val_loss: 1.0062 - val_accuracy: 0.6990 Epoch 9/150 283/283 [==============================] - 29s 102ms/step - loss: 0.3805 - accuracy: 0.8818 - val_loss: 0.7182 - val_accuracy: 0.7990 Epoch 10/150 283/283 [==============================] - 29s 102ms/step - loss: 0.3063 - accuracy: 0.9064 - val_loss: 1.2125 - val_accuracy: 0.7080 Epoch 11/150 283/283 [==============================] - 29s 101ms/step - loss: 0.2997 - accuracy: 0.9119 - val_loss: 0.7861 - val_accuracy: 0.8017 Epoch 12/150 283/283 [==============================] - 29s 102ms/step - loss: 0.2635 - accuracy: 0.9179 - val_loss: 1.2866 - val_accuracy: 0.6813 Epoch 13/150 283/283 [==============================] - 29s 102ms/step - loss: 0.2219 - accuracy: 0.9293 - val_loss: 0.7728 - val_accuracy: 0.7880 Epoch 14/150 283/283 [==============================] - 29s 102ms/step - loss: 0.2383 - accuracy: 0.9297 - val_loss: 1.0445 - val_accuracy: 0.7543 Epoch 15/150 283/283 [==============================] - 29s 102ms/step - loss: 0.2030 - accuracy: 0.9369 - val_loss: 0.3699 - val_accuracy: 0.8977 Epoch 16/150 283/283 [==============================] - 29s 102ms/step - loss: 0.1616 - accuracy: 0.9504 - val_loss: 0.4077 - val_accuracy: 0.8910 Epoch 17/150 283/283 [==============================] - 29s 102ms/step - loss: 0.1639 - accuracy: 0.9518 - val_loss: 1.0256 - val_accuracy: 0.7683 Epoch 18/150 283/283 [==============================] - 29s 101ms/step - loss: 0.1467 - accuracy: 0.9572 - val_loss: 0.3607 - val_accuracy: 0.8957 Epoch 19/150 283/283 [==============================] - 29s 102ms/step - loss: 0.1225 - accuracy: 0.9630 - val_loss: 0.4118 - val_accuracy: 0.8953 Epoch 20/150 283/283 [==============================] - 29s 101ms/step - loss: 0.1781 - accuracy: 0.9451 - val_loss: 0.3018 - val_accuracy: 0.9213 Epoch 21/150 283/283 [==============================] - 29s 101ms/step - loss: 0.1208 - accuracy: 0.9641 - val_loss: 0.4128 - val_accuracy: 0.8870 Epoch 22/150 283/283 [==============================] - 29s 101ms/step - loss: 0.1338 - accuracy: 0.9589 - val_loss: 0.2639 - val_accuracy: 0.9310 Epoch 23/150 283/283 [==============================] - 29s 101ms/step - loss: 0.0822 - accuracy: 0.9751 - val_loss: 0.5764 - val_accuracy: 0.8683 Epoch 24/150 283/283 [==============================] - 29s 101ms/step - loss: 0.1204 - accuracy: 0.9658 - val_loss: 0.3724 - val_accuracy: 0.9070 Epoch 25/150 283/283 [==============================] - 29s 102ms/step - loss: 0.0636 - accuracy: 0.9809 - val_loss: 0.2308 - val_accuracy: 0.9370 Epoch 26/150 283/283 [==============================] - 29s 101ms/step - loss: 0.1271 - accuracy: 0.9615 - val_loss: 0.3110 - val_accuracy: 0.9167 Epoch 27/150 283/283 [==============================] - 29s 102ms/step - loss: 0.1188 - accuracy: 0.9664 - val_loss: 0.4177 - val_accuracy: 0.8963 Epoch 28/150 283/283 [==============================] - 29s 102ms/step - loss: 0.1273 - accuracy: 0.9627 - val_loss: 0.2578 - val_accuracy: 0.9320 Epoch 29/150 283/283 [==============================] - 29s 101ms/step - loss: 0.0670 - accuracy: 0.9809 - val_loss: 0.3640 - val_accuracy: 0.9137 Epoch 30/150 283/283 [==============================] - 29s 101ms/step - loss: 0.0848 - accuracy: 0.9744 - val_loss: 0.9929 - val_accuracy: 0.8233 Epoch 31/150 283/283 [==============================] - 29s 102ms/step - loss: 0.0793 - accuracy: 0.9776 - val_loss: 0.2292 - val_accuracy: 0.9400 Epoch 32/150 283/283 [==============================] - 29s 101ms/step - loss: 0.0840 - accuracy: 0.9767 - val_loss: 0.3639 - val_accuracy: 0.9057 Epoch 33/150 283/283 [==============================] - 30s 104ms/step - loss: 0.0897 - accuracy: 0.9722 - val_loss: 0.2834 - val_accuracy: 0.9293 Epoch 34/150 283/283 [==============================] - 30s 105ms/step - loss: 0.0701 - accuracy: 0.9823 - val_loss: 0.2533 - val_accuracy: 0.9430 Epoch 35/150 283/283 [==============================] - 30s 104ms/step - loss: 0.0965 - accuracy: 0.9721 - val_loss: 0.2141 - val_accuracy: 0.9510 Epoch 36/150 283/283 [==============================] - 30s 105ms/step - loss: 0.0533 - accuracy: 0.9870 - val_loss: 0.2108 - val_accuracy: 0.9503 Epoch 37/150 283/283 [==============================] - 30s 106ms/step - loss: 0.0202 - accuracy: 0.9951 - val_loss: 0.2041 - val_accuracy: 0.9530 Epoch 38/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0228 - accuracy: 0.9951 - val_loss: 0.2113 - val_accuracy: 0.9523 Epoch 39/150 283/283 [==============================] - 34s 119ms/step - loss: 0.0506 - accuracy: 0.9853 - val_loss: 0.3593 - val_accuracy: 0.9183 Epoch 40/150 283/283 [==============================] - 33s 115ms/step - loss: 0.0606 - accuracy: 0.9834 - val_loss: 0.2539 - val_accuracy: 0.9433 Epoch 41/150 283/283 [==============================] - 33s 114ms/step - loss: 0.0306 - accuracy: 0.9919 - val_loss: 0.3426 - val_accuracy: 0.9203 Epoch 42/150 283/283 [==============================] - 43s 149ms/step - loss: 0.0601 - accuracy: 0.9825 - val_loss: 0.3262 - val_accuracy: 0.9270 Epoch 43/150 283/283 [==============================] - 44s 153ms/step - loss: 0.0681 - accuracy: 0.9805 - val_loss: 0.3260 - val_accuracy: 0.9297 Epoch 44/150 283/283 [==============================] - 37s 130ms/step - loss: 0.0266 - accuracy: 0.9931 - val_loss: 0.3613 - val_accuracy: 0.9213 Epoch 45/150 283/283 [==============================] - 33s 117ms/step - loss: 0.0453 - accuracy: 0.9863 - val_loss: 0.2679 - val_accuracy: 0.9383 Epoch 46/150 283/283 [==============================] - 31s 110ms/step - loss: 0.0983 - accuracy: 0.9699 - val_loss: 0.4198 - val_accuracy: 0.9053 Epoch 47/150 283/283 [==============================] - 31s 110ms/step - loss: 0.0299 - accuracy: 0.9919 - val_loss: 0.2019 - val_accuracy: 0.9527 Epoch 48/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0292 - accuracy: 0.9932 - val_loss: 0.2078 - val_accuracy: 0.9540 Epoch 49/150 283/283 [==============================] - 31s 108ms/step - loss: 0.0642 - accuracy: 0.9821 - val_loss: 0.2108 - val_accuracy: 0.9540 Epoch 50/150 283/283 [==============================] - 30s 105ms/step - loss: 0.0287 - accuracy: 0.9924 - val_loss: 0.3471 - val_accuracy: 0.9300 Epoch 51/150 283/283 [==============================] - 30s 105ms/step - loss: 0.0246 - accuracy: 0.9952 - val_loss: 0.2624 - val_accuracy: 0.9383 Epoch 52/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0825 - accuracy: 0.9746 - val_loss: 0.2283 - val_accuracy: 0.9390 Epoch 53/150 283/283 [==============================] - 30s 106ms/step - loss: 0.0358 - accuracy: 0.9911 - val_loss: 0.2587 - val_accuracy: 0.9427 Epoch 54/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0538 - accuracy: 0.9854 - val_loss: 0.2013 - val_accuracy: 0.9507 Epoch 55/150 283/283 [==============================] - 35s 122ms/step - loss: 0.0352 - accuracy: 0.9899 - val_loss: 0.1913 - val_accuracy: 0.9553 Epoch 56/150 283/283 [==============================] - 41s 145ms/step - loss: 0.0237 - accuracy: 0.9940 - val_loss: 0.2169 - val_accuracy: 0.9503 Epoch 57/150 283/283 [==============================] - 37s 131ms/step - loss: 0.0254 - accuracy: 0.9935 - val_loss: 0.3058 - val_accuracy: 0.9293 Epoch 58/150 283/283 [==============================] - 34s 120ms/step - loss: 0.0188 - accuracy: 0.9948 - val_loss: 0.2745 - val_accuracy: 0.9393 Epoch 59/150 283/283 [==============================] - 40s 142ms/step - loss: 0.0427 - accuracy: 0.9885 - val_loss: 0.4339 - val_accuracy: 0.9257 Epoch 60/150 283/283 [==============================] - 36s 128ms/step - loss: 0.0686 - accuracy: 0.9825 - val_loss: 0.3260 - val_accuracy: 0.9320 Epoch 61/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0805 - accuracy: 0.9774 - val_loss: 0.2142 - val_accuracy: 0.9520 Epoch 62/150 283/283 [==============================] - 33s 116ms/step - loss: 0.0179 - accuracy: 0.9962 - val_loss: 0.1659 - val_accuracy: 0.9597 Epoch 63/150 283/283 [==============================] - 38s 135ms/step - loss: 0.0104 - accuracy: 0.9980 - val_loss: 0.1550 - val_accuracy: 0.9617 Epoch 64/150 283/283 [==============================] - 42s 149ms/step - loss: 0.0293 - accuracy: 0.9926 - val_loss: 0.1996 - val_accuracy: 0.9517 Epoch 65/150 283/283 [==============================] - 39s 136ms/step - loss: 0.0181 - accuracy: 0.9962 - val_loss: 0.2459 - val_accuracy: 0.9490 Epoch 66/150 283/283 [==============================] - 31s 109ms/step - loss: 0.0327 - accuracy: 0.9907 - val_loss: 0.3968 - val_accuracy: 0.9103 Epoch 67/150 283/283 [==============================] - 31s 110ms/step - loss: 0.0409 - accuracy: 0.9883 - val_loss: 0.3194 - val_accuracy: 0.9227 Epoch 68/150 283/283 [==============================] - 35s 122ms/step - loss: 0.0138 - accuracy: 0.9975 - val_loss: 0.1712 - val_accuracy: 0.9610 Epoch 69/150 283/283 [==============================] - 35s 122ms/step - loss: 0.0088 - accuracy: 0.9987 - val_loss: 0.1681 - val_accuracy: 0.9647 Epoch 70/150 283/283 [==============================] - 39s 138ms/step - loss: 0.0081 - accuracy: 0.9989 - val_loss: 0.2022 - val_accuracy: 0.9597 Epoch 71/150 283/283 [==============================] - 32s 113ms/step - loss: 0.0369 - accuracy: 0.9901 - val_loss: 0.6708 - val_accuracy: 0.8647 Epoch 72/150 283/283 [==============================] - 31s 108ms/step - loss: 0.0449 - accuracy: 0.9881 - val_loss: 0.4036 - val_accuracy: 0.9180 Epoch 73/150 283/283 [==============================] - 32s 110ms/step - loss: 0.0227 - accuracy: 0.9937 - val_loss: 0.2781 - val_accuracy: 0.9443 Epoch 74/150 283/283 [==============================] - 32s 112ms/step - loss: 0.0227 - accuracy: 0.9941 - val_loss: 0.3525 - val_accuracy: 0.9267 Epoch 75/150 283/283 [==============================] - 32s 113ms/step - loss: 0.0254 - accuracy: 0.9937 - val_loss: 0.2863 - val_accuracy: 0.9390 Epoch 76/150 283/283 [==============================] - 31s 108ms/step - loss: 0.0367 - accuracy: 0.9885 - val_loss: 0.1747 - val_accuracy: 0.9590 Epoch 77/150 283/283 [==============================] - 32s 112ms/step - loss: 0.0267 - accuracy: 0.9930 - val_loss: 0.2391 - val_accuracy: 0.9537 Epoch 78/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0131 - accuracy: 0.9966 - val_loss: 0.1610 - val_accuracy: 0.9597 Epoch 79/150 283/283 [==============================] - 31s 109ms/step - loss: 0.0155 - accuracy: 0.9972 - val_loss: 0.3580 - val_accuracy: 0.9193 Epoch 80/150 283/283 [==============================] - 31s 109ms/step - loss: 0.0206 - accuracy: 0.9951 - val_loss: 0.2150 - val_accuracy: 0.9577 Epoch 81/150 283/283 [==============================] - 32s 113ms/step - loss: 0.0233 - accuracy: 0.9934 - val_loss: 0.4451 - val_accuracy: 0.9150 Epoch 82/150 283/283 [==============================] - 32s 113ms/step - loss: 0.0529 - accuracy: 0.9859 - val_loss: 0.2076 - val_accuracy: 0.9533 Epoch 83/150 283/283 [==============================] - 31s 108ms/step - loss: 0.0597 - accuracy: 0.9832 - val_loss: 0.2463 - val_accuracy: 0.9477 Epoch 84/150 283/283 [==============================] - 31s 107ms/step - loss: 0.0212 - accuracy: 0.9945 - val_loss: 0.1705 - val_accuracy: 0.9630 Epoch 85/150 283/283 [==============================] - 34s 117ms/step - loss: 0.0371 - accuracy: 0.9919 - val_loss: 0.1878 - val_accuracy: 0.9587 Epoch 86/150 283/283 [==============================] - 34s 119ms/step - loss: 0.0369 - accuracy: 0.9895 - val_loss: 0.2304 - val_accuracy: 0.9473 Epoch 87/150 283/283 [==============================] - 33s 114ms/step - loss: 0.0094 - accuracy: 0.9978 - val_loss: 0.1986 - val_accuracy: 0.9580 Epoch 88/150 283/283 [==============================] - 32s 112ms/step - loss: 0.0091 - accuracy: 0.9987 - val_loss: 0.1767 - val_accuracy: 0.9610 Epoch 89/150 283/283 [==============================] - 32s 111ms/step - loss: 0.0196 - accuracy: 0.9949 - val_loss: 0.1793 - val_accuracy: 0.9593 Epoch 90/150 283/283 [==============================] - 34s 118ms/step - loss: 0.0092 - accuracy: 0.9984 - val_loss: 0.1914 - val_accuracy: 0.9623 Epoch 91/150 283/283 [==============================] - 39s 137ms/step - loss: 0.0377 - accuracy: 0.9890 - val_loss: 0.2695 - val_accuracy: 0.9463 Epoch 92/150 283/283 [==============================] - 35s 122ms/step - loss: 0.0511 - accuracy: 0.9859 - val_loss: 0.2105 - val_accuracy: 0.9527 Epoch 93/150 283/283 [==============================] - 35s 124ms/step - loss: 0.0328 - accuracy: 0.9912 - val_loss: 0.1726 - val_accuracy: 0.9653
print(res128_vggmodel.summary())
Model: "sequential_96"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
sequential_97 (Sequential) (None, 64, 64, 32) 9824
sequential_98 (Sequential) (None, 32, 32, 64) 55936
sequential_99 (Sequential) (None, 16, 16, 128) 370560
sequential_100 (Sequential) (None, 8, 8, 256) 1478400
sequential_101 (Sequential) (None, 4, 4, 256) 1773312
dense_60 (Dense) (None, 4, 4, 4096) 1052672
dense_61 (Dense) (None, 4, 4, 4096) 16781312
activity_regularization_21 (None, 4, 4, 4096) 0
(ActivityRegularization)
flatten_21 (Flatten) (None, 65536) 0
dense_62 (Dense) (None, 15) 983055
=================================================================
Total params: 22,505,071
Trainable params: 22,500,847
Non-trainable params: 4,224
_________________________________________________________________
None
tf.keras.utils.plot_model(res128_vggmodel, show_shapes=True, dpi=80)
Here, we define a simpler version of the InceptionNet v3 model, using a inception block layout of 3-5-2, We remove one more convolutional layers to simplify it further, but add a regularization layer to stabilize the model. We also implement a dropout layer to prevent combat the model's tendency to overfit.
res128_inceptionmodel = Sequential()
res128_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = 2, padding = "same", activation = "relu"))
res128_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = 1, padding = "same", activation = "relu"))
res128_inceptionmodel.add(MaxPooling2D((2,2)))
res128_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = 1, padding = "same", activation = "relu"))
res128_inceptionmodel.add(Conv2D(64, kernel_size = (3,3), strides = 2, padding = "same", activation = "relu"))
res128_inceptionmodel.add(MaxPooling2D((2,2)))
res128_inceptionmodel.add(inceptionBlock1(64))
res128_inceptionmodel.add(inceptionBlock1(64))
res128_inceptionmodel.add(inceptionBlock1(64))
res128_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res128_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res128_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res128_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res128_inceptionmodel.add(inceptionBlock2(num_filters = 64, n = 3))
res128_inceptionmodel.add(inceptionBlock3(num_filters = 64))
res128_inceptionmodel.add(inceptionBlock3(num_filters = 64))
res128_inceptionmodel.add(AveragePooling2D((2,2)))
res128_inceptionmodel.add(Dropout(0.1))
res128_inceptionmodel.add(Dense(2048, activation = "relu"))
res128_inceptionmodel.add(ActivityRegularization(l1 = 0.0005, l2 = 0.0002))
res128_inceptionmodel.add(Flatten())
res128_inceptionmodel.add(Dense(num_classes, activation = "softmax"))
res128_inceptionmodel.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = res128_inceptionmodel.fit(std_res128_train, validation_data = std_res128_validation, epochs = 150, callbacks = earlyStopper)
cnnf.model_metric_graph(history, res128_inceptionmodel, std_res31_validation, name = "adapted InceptionNet v3", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 35s 88ms/step - loss: 2.6546 - accuracy: 0.1025 - val_loss: 2.7972 - val_accuracy: 0.0667 Epoch 2/150 283/283 [==============================] - 25s 86ms/step - loss: 2.5269 - accuracy: 0.1485 - val_loss: 2.6198 - val_accuracy: 0.1530 Epoch 3/150 283/283 [==============================] - 25s 86ms/step - loss: 2.2093 - accuracy: 0.2829 - val_loss: 2.1044 - val_accuracy: 0.2797 Epoch 4/150 283/283 [==============================] - 24s 85ms/step - loss: 1.8524 - accuracy: 0.4094 - val_loss: 3.3843 - val_accuracy: 0.2267 Epoch 5/150 283/283 [==============================] - 25s 86ms/step - loss: 1.6113 - accuracy: 0.4988 - val_loss: 2.2433 - val_accuracy: 0.3830 Epoch 6/150 283/283 [==============================] - 35s 121ms/step - loss: 1.3612 - accuracy: 0.5773 - val_loss: 2.5690 - val_accuracy: 0.4683 Epoch 7/150 283/283 [==============================] - 34s 120ms/step - loss: 1.1385 - accuracy: 0.6567 - val_loss: 1.3616 - val_accuracy: 0.5973 Epoch 8/150 283/283 [==============================] - 35s 121ms/step - loss: 0.9475 - accuracy: 0.7112 - val_loss: 1.1796 - val_accuracy: 0.6557 Epoch 9/150 283/283 [==============================] - 35s 122ms/step - loss: 0.7960 - accuracy: 0.7656 - val_loss: 3.4560 - val_accuracy: 0.2463 Epoch 10/150 283/283 [==============================] - 35s 120ms/step - loss: 0.6710 - accuracy: 0.8003 - val_loss: 0.8690 - val_accuracy: 0.7510 Epoch 11/150 283/283 [==============================] - 35s 122ms/step - loss: 0.5763 - accuracy: 0.8334 - val_loss: 2.9599 - val_accuracy: 0.4627 Epoch 12/150 283/283 [==============================] - 34s 119ms/step - loss: 0.4907 - accuracy: 0.8654 - val_loss: 0.8416 - val_accuracy: 0.7640 Epoch 13/150 283/283 [==============================] - 34s 119ms/step - loss: 0.4368 - accuracy: 0.8819 - val_loss: 0.5892 - val_accuracy: 0.8433 Epoch 14/150 283/283 [==============================] - 34s 120ms/step - loss: 0.3697 - accuracy: 0.9014 - val_loss: 0.9138 - val_accuracy: 0.7617 Epoch 15/150 283/283 [==============================] - 29s 101ms/step - loss: 0.3228 - accuracy: 0.9169 - val_loss: 1.2821 - val_accuracy: 0.7033 Epoch 16/150 283/283 [==============================] - 27s 93ms/step - loss: 0.3099 - accuracy: 0.9190 - val_loss: 1.0322 - val_accuracy: 0.7300 Epoch 17/150 283/283 [==============================] - 25s 87ms/step - loss: 0.2796 - accuracy: 0.9286 - val_loss: 1.4179 - val_accuracy: 0.7173 Epoch 18/150 283/283 [==============================] - 26s 89ms/step - loss: 0.2518 - accuracy: 0.9358 - val_loss: 0.9803 - val_accuracy: 0.7753 Epoch 19/150 283/283 [==============================] - 26s 90ms/step - loss: 0.2256 - accuracy: 0.9444 - val_loss: 7.0671 - val_accuracy: 0.3923 Epoch 20/150 283/283 [==============================] - 26s 90ms/step - loss: 0.2363 - accuracy: 0.9458 - val_loss: 0.6893 - val_accuracy: 0.8127 Epoch 21/150 283/283 [==============================] - 25s 87ms/step - loss: 0.2179 - accuracy: 0.9486 - val_loss: 0.4290 - val_accuracy: 0.9000 Epoch 22/150 283/283 [==============================] - 25s 86ms/step - loss: 0.1886 - accuracy: 0.9558 - val_loss: 0.8490 - val_accuracy: 0.8257 Epoch 23/150 283/283 [==============================] - 26s 90ms/step - loss: 0.1831 - accuracy: 0.9575 - val_loss: 0.4366 - val_accuracy: 0.8890 Epoch 24/150 283/283 [==============================] - 34s 120ms/step - loss: 0.1667 - accuracy: 0.9615 - val_loss: 0.7386 - val_accuracy: 0.8210 Epoch 25/150 283/283 [==============================] - 34s 119ms/step - loss: 0.1583 - accuracy: 0.9623 - val_loss: 0.4640 - val_accuracy: 0.8840 Epoch 26/150 283/283 [==============================] - 31s 109ms/step - loss: 0.1598 - accuracy: 0.9639 - val_loss: 0.4939 - val_accuracy: 0.8893 Epoch 27/150 283/283 [==============================] - 25s 85ms/step - loss: 0.1633 - accuracy: 0.9643 - val_loss: 0.8082 - val_accuracy: 0.8107 Epoch 28/150 283/283 [==============================] - 25s 85ms/step - loss: 0.1693 - accuracy: 0.9644 - val_loss: 0.6440 - val_accuracy: 0.8370 Epoch 29/150 283/283 [==============================] - 24s 85ms/step - loss: 0.1479 - accuracy: 0.9689 - val_loss: 0.9061 - val_accuracy: 0.8400 Epoch 30/150 283/283 [==============================] - 29s 100ms/step - loss: 0.1551 - accuracy: 0.9688 - val_loss: 0.5892 - val_accuracy: 0.8703 Epoch 31/150 283/283 [==============================] - 33s 114ms/step - loss: 0.1345 - accuracy: 0.9724 - val_loss: 0.5668 - val_accuracy: 0.8623 Epoch 32/150 283/283 [==============================] - 32s 111ms/step - loss: 0.1241 - accuracy: 0.9729 - val_loss: 0.4501 - val_accuracy: 0.8940 Epoch 33/150 283/283 [==============================] - 32s 110ms/step - loss: 0.1265 - accuracy: 0.9726 - val_loss: 0.3528 - val_accuracy: 0.9223 Epoch 34/150 283/283 [==============================] - 32s 111ms/step - loss: 0.1512 - accuracy: 0.9672 - val_loss: 0.4429 - val_accuracy: 0.9007 Epoch 35/150 283/283 [==============================] - 32s 111ms/step - loss: 0.1273 - accuracy: 0.9731 - val_loss: 0.5576 - val_accuracy: 0.8863 Epoch 36/150 283/283 [==============================] - 32s 112ms/step - loss: 0.1331 - accuracy: 0.9711 - val_loss: 0.4371 - val_accuracy: 0.9010 Epoch 37/150 283/283 [==============================] - 34s 117ms/step - loss: 0.1087 - accuracy: 0.9767 - val_loss: 0.4523 - val_accuracy: 0.8937 Epoch 38/150 283/283 [==============================] - 34s 119ms/step - loss: 0.1122 - accuracy: 0.9759 - val_loss: 0.4037 - val_accuracy: 0.9110 Epoch 39/150 283/283 [==============================] - 33s 116ms/step - loss: 0.1128 - accuracy: 0.9746 - val_loss: 0.3977 - val_accuracy: 0.9090 Epoch 40/150 283/283 [==============================] - 33s 116ms/step - loss: 0.1075 - accuracy: 0.9771 - val_loss: 0.4938 - val_accuracy: 0.8913 Epoch 41/150 283/283 [==============================] - 31s 107ms/step - loss: 0.1242 - accuracy: 0.9733 - val_loss: 0.5391 - val_accuracy: 0.8780 Epoch 42/150 283/283 [==============================] - 31s 107ms/step - loss: 0.1223 - accuracy: 0.9715 - val_loss: 0.4469 - val_accuracy: 0.8900 Epoch 43/150 283/283 [==============================] - 30s 106ms/step - loss: 0.1196 - accuracy: 0.9732 - val_loss: 1.1554 - val_accuracy: 0.7793 Epoch 44/150 283/283 [==============================] - 33s 114ms/step - loss: 0.1159 - accuracy: 0.9742 - val_loss: 0.3598 - val_accuracy: 0.9217 Epoch 45/150 283/283 [==============================] - 33s 116ms/step - loss: 0.1096 - accuracy: 0.9762 - val_loss: 0.5892 - val_accuracy: 0.8787 Epoch 46/150 283/283 [==============================] - 32s 110ms/step - loss: 0.1007 - accuracy: 0.9766 - val_loss: 1.3659 - val_accuracy: 0.8467 Epoch 47/150 283/283 [==============================] - 31s 106ms/step - loss: 0.1377 - accuracy: 0.9731 - val_loss: 0.5081 - val_accuracy: 0.8983 Epoch 48/150 283/283 [==============================] - 31s 108ms/step - loss: 0.1194 - accuracy: 0.9741 - val_loss: 0.8553 - val_accuracy: 0.8250 Epoch 49/150 283/283 [==============================] - 31s 109ms/step - loss: 0.1036 - accuracy: 0.9750 - val_loss: 0.4313 - val_accuracy: 0.9077 Epoch 50/150 283/283 [==============================] - 31s 107ms/step - loss: 0.1115 - accuracy: 0.9786 - val_loss: 0.4881 - val_accuracy: 0.8910 Epoch 51/150 283/283 [==============================] - 31s 106ms/step - loss: 0.1206 - accuracy: 0.9716 - val_loss: 0.4809 - val_accuracy: 0.8897 Epoch 52/150 283/283 [==============================] - 32s 112ms/step - loss: 0.0955 - accuracy: 0.9793 - val_loss: 0.6768 - val_accuracy: 0.8787 Epoch 53/150 283/283 [==============================] - 34s 116ms/step - loss: 0.1226 - accuracy: 0.9747 - val_loss: 0.4604 - val_accuracy: 0.8893 Epoch 54/150 283/283 [==============================] - 34s 117ms/step - loss: 0.0932 - accuracy: 0.9784 - val_loss: 0.4904 - val_accuracy: 0.8870 Epoch 55/150 283/283 [==============================] - 31s 108ms/step - loss: 0.1046 - accuracy: 0.9756 - val_loss: 1.4689 - val_accuracy: 0.7683 Epoch 56/150 283/283 [==============================] - 31s 108ms/step - loss: 0.1071 - accuracy: 0.9760 - val_loss: 0.4155 - val_accuracy: 0.9160 Epoch 57/150 283/283 [==============================] - 31s 108ms/step - loss: 0.1096 - accuracy: 0.9761 - val_loss: 0.8019 - val_accuracy: 0.8260 Epoch 58/150 283/283 [==============================] - 32s 112ms/step - loss: 0.1604 - accuracy: 0.9730 - val_loss: 0.5163 - val_accuracy: 0.8887 Epoch 59/150 283/283 [==============================] - 31s 108ms/step - loss: 0.1047 - accuracy: 0.9771 - val_loss: 0.4714 - val_accuracy: 0.9020 Epoch 60/150 283/283 [==============================] - 31s 109ms/step - loss: 0.1074 - accuracy: 0.9753 - val_loss: 0.5976 - val_accuracy: 0.8673 Epoch 61/150 283/283 [==============================] - 32s 111ms/step - loss: 0.1105 - accuracy: 0.9784 - val_loss: 0.5041 - val_accuracy: 0.9053 Epoch 62/150 283/283 [==============================] - 31s 108ms/step - loss: 0.0965 - accuracy: 0.9782 - val_loss: 0.4848 - val_accuracy: 0.8973 Epoch 63/150 283/283 [==============================] - 31s 109ms/step - loss: 0.0995 - accuracy: 0.9782 - val_loss: 0.5448 - val_accuracy: 0.8853
res128_inceptionmodel.summary()
Model: "sequential_72"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d_162 (Conv2D) (None, 64, 64, 64) 640
conv2d_163 (Conv2D) (None, 64, 64, 64) 36928
max_pooling2d_66 (MaxPoolin (None, 32, 32, 64) 0
g2D)
conv2d_164 (Conv2D) (None, 32, 32, 64) 36928
conv2d_165 (Conv2D) (None, 16, 16, 64) 36928
max_pooling2d_67 (MaxPoolin (None, 8, 8, 64) 0
g2D)
inception_block1_17 (incept (None, 8, 8, 256) 127424
ionBlock1)
inception_block1_18 (incept (None, 8, 8, 256) 176576
ionBlock1)
inception_block1_19 (incept (None, 8, 8, 256) 176576
ionBlock1)
inception_block2_28 (incept (None, 8, 8, 256) 139904
ionBlock2)
inception_block2_29 (incept (None, 8, 8, 256) 139904
ionBlock2)
inception_block2_30 (incept (None, 8, 8, 256) 139904
ionBlock2)
inception_block2_31 (incept (None, 8, 8, 256) 139904
ionBlock2)
inception_block2_32 (incept (None, 8, 8, 256) 139904
ionBlock2)
inception_block3_11 (incept (None, 8, 8, 384) 152128
ionBlock3)
inception_block3_12 (incept (None, 8, 8, 384) 184896
ionBlock3)
average_pooling2d_6 (Averag (None, 4, 4, 384) 0
ePooling2D)
dropout_6 (Dropout) (None, 4, 4, 384) 0
dense_44 (Dense) (None, 4, 4, 2048) 788480
activity_regularization_15 (None, 4, 4, 2048) 0
(ActivityRegularization)
flatten_15 (Flatten) (None, 32768) 0
dense_45 (Dense) (None, 15) 491535
=================================================================
Total params: 2,908,559
Trainable params: 2,908,559
Non-trainable params: 0
_________________________________________________________________
tf.keras.utils.plot_model(res128_inceptionmodel, show_shapes=True, dpi=80)
finalres128_model_list = [res128_vggmodel, res128_inceptionmodel, baseline128luminosity_model]
finalres128_data_list = [std_res128_validation, std_res128_validation, res128luminosity_validation_data]
name_list = ["Modified VGG-16", "Modified Inception v3", "Baseline CNN"]
cnnf.model_comparison(finalres128_model_list, finalres128_data_list, name_list)
94/94 [==============================] - 5s 46ms/step - loss: 0.1726 - accuracy: 0.9653 94/94 [==============================] - 3s 30ms/step - loss: 0.5448 - accuracy: 0.8853 94/94 [==============================] - 2s 16ms/step - loss: 1.3337 - accuracy: 0.6287
From the above, we can see that the VGG-16 model performs the best, better than both the baseline and the InceptionNet model that we have created.
To help us further improve the performance of the model, we shall attempt to fix the class imbalance, and hypertune the parameters of the model to allow it to acheive higher accuracies. Firstly, we shall use augmentation to try to fix the class imbalance. Earlier on, when attempting to create the model, there was a issue where a certain combination of layers and parameters would result in the models predicting only tomatoes (henceforth called Tomato Model). The Tomato Model may have been caused by several factors, one of which being it being the most populous in the dataset. To help the model better predict the less popular classes, we shall augment those classes to give all classes a "fair" chance.
def augment_dataset(dataset, rotation_range=30, zoom_range=0.2, width_shift_range=0.1, height_shift_range=0.1, horizontal_flip=True):
def augment(image, label):
# Randomly flip the image horizontally
if horizontal_flip:
image = tf.image.random_flip_left_right(image)
image = tf.image.rot90(image, k=tf.random.uniform(shape=[], minval=0, maxval=4, dtype=tf.int32))
image = tf.keras.layers.RandomZoom(zoom_range)(image)
return image, label
return dataset.map(augment)
aug_res31_train = augment_dataset(std_res31_train)
aug_res128_train = augment_dataset(std_res128_train)
WARNING:tensorflow:Using a while_loop for converting RngReadAndSkip cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting Bitcast cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting Bitcast cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting StatelessRandomUniformV2 cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting ImageProjectiveTransformV3 cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting RngReadAndSkip cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting Bitcast cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting Bitcast cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting StatelessRandomUniformV2 cause there is no registered converter for this op. WARNING:tensorflow:Using a while_loop for converting ImageProjectiveTransformV3 cause there is no registered converter for this op.
The figure below shows the effect of augmentation on the (31 x 31) dataset.
plt.figure(figsize=(20, 16))
# Take the first grid_size^2 images from the dataset
for i, (image, _) in enumerate(aug_res31_train.unbatch().take(8 ** 2)):
# Add a subplot for each image
plt.subplot(8, 8, i + 1)
plt.imshow(image.numpy(), cmap = "gray")
plt.axis("off")
# Display the plot
plt.show()
The figure below shows the effect of augmentation on the (128 x 128) dataset.
plt.figure(figsize=(20, 16))
# Take the first grid_size^2 images from the dataset
for i, (image, _) in enumerate(aug_res128_train.unbatch().take(8 ** 2)):
# Add a subplot for each image
plt.subplot(8, 8, i + 1)
plt.imshow(image.numpy(), cmap = "gray")
plt.axis("off")
# Display the plot
plt.show()
As we observed earlier, the given dataset is severly imbalanced, with some classes being represented more often than others. To help us combat this, we shall attempt to use class weights to promote a more equal representation of all classes. Class Weights does this by assigning different weights to each class during the training phase, so that it the contribution of each class is balanced. It is typically inversely proportional to the class's frequency in the dataset.
def generate_class_weights(class_series, multi_class=True, one_hot_encoded=False):
if multi_class:
# If class is one hot encoded, transform to categorical labels to use compute_class_weight
if one_hot_encoded:
class_series = np.argmax(class_series, axis=1)
# Compute class weights with sklearn method
class_labels = np.unique(class_series)
class_weights = compute_class_weight(class_weight='balanced', classes=class_labels, y=class_series)
return dict(zip(class_labels, class_weights))
else:
# It is neccessary that the multi-label values are one-hot encoded
mlb = None
if not one_hot_encoded:
mlb = MultiLabelBinarizer()
class_series = mlb.fit_transform(class_series)
n_samples = len(class_series)
n_classes = len(class_series[0])
# Count each class frequency
class_count = [0] * n_classes
for classes in class_series:
for index in range(n_classes):
if classes[index] != 0:
class_count[index] += 1
# Compute class weights using balanced method
class_weights = [n_samples / (n_classes * freq) if freq > 0 else 1 for freq in class_count]
class_labels = range(len(class_weights)) if mlb is None else mlb.classes_
return dict(zip(class_labels, class_weights))
labels = np.array([label.numpy() for _, label in std_res31_train.unbatch()])
class_weights = generate_class_weights(labels)
print(class_weights)
{0: 0.7716239316239316, 1: 0.835925925925926, 2: 1.3647770219198792, 3: 0.693394777265745, 4: 0.8024888888888889, 5: 1.1965540092776674, 6: 1.7147198480531813, 7: 2.3510416666666667, 8: 1.0253265190232823, 9: 0.7412151067323481, 10: 1.0633686690223794, 11: 1.5964633068081344, 12: 0.7393939393939394, 13: 2.4268817204301074, 14: 0.6302268760907505}
Another improvement that can be done to the model is Random Blockout. Random Blockout works by randomly hiding patches in the image, to force the network to seek for other as many other details in the image for object classification. This can be typically done using black pixels, or by filling it with random values. This is typically done to combat overfitting, and bolster accuracy. However, as our VGG architecture(s) both do not overfit to a large extent, we can skip implementing this improvement.
Now, we shall run the best VGG model layout 3 more times, and with 3 different permutations.
Once those models have been trained, we shall compare it to the base model, with none of these improvments applied, so that we can see how it performs relative to each other.
res31_final = Sequential()
res31_final.add(Input(shape = (31, 31, 1)))
res31_final.add(vgg_block(2, 32))
res31_final.add(vgg_block(2, 64))
res31_final.add(vgg_block(3, 128))
res31_final.add(vgg_block(3, 256))
res31_final.add(Flatten())
res31_final.add(Dense(4096, activation = "relu"))
res31_final.add(Dense(4096, activation = "relu"))
res31_final.add(ActivityRegularization(l1 = 0.0001, l2 = 0.0001))
res31_final.add(Dense(num_classes, activation = "softmax"))
res31_final.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
res31_final_cw = clone_model(res31_final)
res31_final_aug = clone_model(res31_final)
res31_final_cw_aug = clone_model(res31_final)
res31_final_cw.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
res31_final_aug.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
res31_final_cw_aug.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = res31_final_cw.fit(std_res31_train, validation_data = std_res31_validation, epochs = 150, callbacks = earlyStopper, class_weight = class_weights)
cnnf.model_metric_graph(history, res31_final_cw, std_res31_validation, name = "adapted VGG-13 (with class weights applied)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 13s 39ms/step - loss: 2.4930 - accuracy: 0.1689 - val_loss: 3.6244 - val_accuracy: 0.0730 Epoch 2/150 283/283 [==============================] - 12s 39ms/step - loss: 1.8793 - accuracy: 0.3681 - val_loss: 2.0493 - val_accuracy: 0.3847 Epoch 3/150 283/283 [==============================] - 12s 40ms/step - loss: 1.5249 - accuracy: 0.4973 - val_loss: 1.6812 - val_accuracy: 0.4467 Epoch 4/150 283/283 [==============================] - 12s 39ms/step - loss: 1.1718 - accuracy: 0.6142 - val_loss: 1.3203 - val_accuracy: 0.5823 Epoch 5/150 283/283 [==============================] - 12s 40ms/step - loss: 0.8995 - accuracy: 0.7159 - val_loss: 0.9450 - val_accuracy: 0.7200 Epoch 6/150 283/283 [==============================] - 12s 42ms/step - loss: 0.7108 - accuracy: 0.7754 - val_loss: 1.0075 - val_accuracy: 0.7387 Epoch 7/150 283/283 [==============================] - 12s 41ms/step - loss: 0.6154 - accuracy: 0.8141 - val_loss: 0.9768 - val_accuracy: 0.7180 Epoch 8/150 283/283 [==============================] - 12s 40ms/step - loss: 0.4802 - accuracy: 0.8519 - val_loss: 0.6890 - val_accuracy: 0.8067 Epoch 9/150 283/283 [==============================] - 12s 40ms/step - loss: 0.4280 - accuracy: 0.8667 - val_loss: 0.6552 - val_accuracy: 0.8177 Epoch 10/150 283/283 [==============================] - 12s 41ms/step - loss: 0.3487 - accuracy: 0.8949 - val_loss: 1.0923 - val_accuracy: 0.7283 Epoch 11/150 283/283 [==============================] - 12s 40ms/step - loss: 0.5172 - accuracy: 0.8429 - val_loss: 0.5424 - val_accuracy: 0.8520 Epoch 12/150 283/283 [==============================] - 11s 37ms/step - loss: 0.3209 - accuracy: 0.9045 - val_loss: 0.6538 - val_accuracy: 0.8290 Epoch 13/150 283/283 [==============================] - 11s 38ms/step - loss: 0.2563 - accuracy: 0.9293 - val_loss: 0.4690 - val_accuracy: 0.8797 Epoch 14/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1926 - accuracy: 0.9449 - val_loss: 0.9545 - val_accuracy: 0.7680 Epoch 15/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1834 - accuracy: 0.9482 - val_loss: 0.4439 - val_accuracy: 0.8867 Epoch 16/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1916 - accuracy: 0.9458 - val_loss: 0.4964 - val_accuracy: 0.8703 Epoch 17/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1518 - accuracy: 0.9561 - val_loss: 0.4947 - val_accuracy: 0.8820 Epoch 18/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1755 - accuracy: 0.9530 - val_loss: 0.5209 - val_accuracy: 0.8750 Epoch 19/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1503 - accuracy: 0.9569 - val_loss: 0.4704 - val_accuracy: 0.8787 Epoch 20/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1366 - accuracy: 0.9638 - val_loss: 0.3949 - val_accuracy: 0.9030 Epoch 21/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1223 - accuracy: 0.9640 - val_loss: 0.3906 - val_accuracy: 0.9073 Epoch 22/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1734 - accuracy: 0.9539 - val_loss: 0.4420 - val_accuracy: 0.8870 Epoch 23/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1912 - accuracy: 0.9475 - val_loss: 0.4733 - val_accuracy: 0.8917 Epoch 24/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1322 - accuracy: 0.9653 - val_loss: 0.5020 - val_accuracy: 0.8823 Epoch 25/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1088 - accuracy: 0.9725 - val_loss: 0.4963 - val_accuracy: 0.8823 Epoch 26/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1159 - accuracy: 0.9687 - val_loss: 0.4366 - val_accuracy: 0.9003 Epoch 27/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1040 - accuracy: 0.9720 - val_loss: 0.4794 - val_accuracy: 0.8880 Epoch 28/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0910 - accuracy: 0.9749 - val_loss: 0.4636 - val_accuracy: 0.8997 Epoch 29/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0680 - accuracy: 0.9822 - val_loss: 0.4260 - val_accuracy: 0.9063 Epoch 30/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0624 - accuracy: 0.9845 - val_loss: 0.4731 - val_accuracy: 0.8887 Epoch 31/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1090 - accuracy: 0.9741 - val_loss: 0.6284 - val_accuracy: 0.8500 Epoch 32/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1113 - accuracy: 0.9693 - val_loss: 0.4593 - val_accuracy: 0.8873 Epoch 33/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0871 - accuracy: 0.9788 - val_loss: 0.5398 - val_accuracy: 0.8823 Epoch 34/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1083 - accuracy: 0.9747 - val_loss: 0.3606 - val_accuracy: 0.9263 Epoch 35/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0580 - accuracy: 0.9868 - val_loss: 0.3895 - val_accuracy: 0.9110 Epoch 36/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0657 - accuracy: 0.9838 - val_loss: 0.4114 - val_accuracy: 0.9120 Epoch 37/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0611 - accuracy: 0.9850 - val_loss: 0.8042 - val_accuracy: 0.8223 Epoch 38/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1597 - accuracy: 0.9560 - val_loss: 0.3442 - val_accuracy: 0.9190 Epoch 39/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1158 - accuracy: 0.9711 - val_loss: 0.3928 - val_accuracy: 0.9087 Epoch 40/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0589 - accuracy: 0.9867 - val_loss: 0.4109 - val_accuracy: 0.9110 Epoch 41/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0484 - accuracy: 0.9877 - val_loss: 0.3428 - val_accuracy: 0.9233 Epoch 42/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0409 - accuracy: 0.9897 - val_loss: 0.4386 - val_accuracy: 0.9130 Epoch 43/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0703 - accuracy: 0.9834 - val_loss: 0.3654 - val_accuracy: 0.9253 Epoch 44/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0792 - accuracy: 0.9780 - val_loss: 0.4137 - val_accuracy: 0.9113 Epoch 45/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1004 - accuracy: 0.9756 - val_loss: 0.4368 - val_accuracy: 0.9077 Epoch 46/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0397 - accuracy: 0.9907 - val_loss: 0.4825 - val_accuracy: 0.9013 Epoch 47/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0550 - accuracy: 0.9858 - val_loss: 0.3459 - val_accuracy: 0.9267 Epoch 48/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0442 - accuracy: 0.9895 - val_loss: 0.5641 - val_accuracy: 0.8803 Epoch 49/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0519 - accuracy: 0.9879 - val_loss: 0.4476 - val_accuracy: 0.9167 Epoch 50/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1270 - accuracy: 0.9698 - val_loss: 0.4857 - val_accuracy: 0.8917 Epoch 51/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0417 - accuracy: 0.9895 - val_loss: 0.3637 - val_accuracy: 0.9200 Epoch 52/150 283/283 [==============================] - 12s 39ms/step - loss: 0.0466 - accuracy: 0.9896 - val_loss: 0.4135 - val_accuracy: 0.9120 Epoch 53/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0561 - accuracy: 0.9860 - val_loss: 0.6074 - val_accuracy: 0.8837 Epoch 54/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0567 - accuracy: 0.9863 - val_loss: 0.3741 - val_accuracy: 0.9250 Epoch 55/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0527 - accuracy: 0.9868 - val_loss: 0.4424 - val_accuracy: 0.9073 Epoch 56/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0538 - accuracy: 0.9876 - val_loss: 0.5402 - val_accuracy: 0.8917 Epoch 57/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0357 - accuracy: 0.9932 - val_loss: 0.4244 - val_accuracy: 0.9213 Epoch 58/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0913 - accuracy: 0.9790 - val_loss: 0.4321 - val_accuracy: 0.9100 Epoch 59/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0563 - accuracy: 0.9878 - val_loss: 0.3447 - val_accuracy: 0.9297 Epoch 60/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0443 - accuracy: 0.9896 - val_loss: 0.4158 - val_accuracy: 0.9190 Epoch 61/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0409 - accuracy: 0.9900 - val_loss: 0.4097 - val_accuracy: 0.9227 Epoch 62/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0710 - accuracy: 0.9812 - val_loss: 0.4956 - val_accuracy: 0.8880 Epoch 63/150 283/283 [==============================] - 11s 39ms/step - loss: 0.0668 - accuracy: 0.9853 - val_loss: 0.4945 - val_accuracy: 0.9023 Epoch 64/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0820 - accuracy: 0.9778 - val_loss: 0.4002 - val_accuracy: 0.9190 Epoch 65/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0897 - accuracy: 0.9782 - val_loss: 0.4384 - val_accuracy: 0.9053 Epoch 66/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0585 - accuracy: 0.9868 - val_loss: 0.3849 - val_accuracy: 0.9203 Epoch 67/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0404 - accuracy: 0.9905 - val_loss: 0.3982 - val_accuracy: 0.9163 Epoch 68/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0285 - accuracy: 0.9951 - val_loss: 0.3511 - val_accuracy: 0.9313 Epoch 69/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0351 - accuracy: 0.9912 - val_loss: 0.3626 - val_accuracy: 0.9260 Epoch 70/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0250 - accuracy: 0.9949 - val_loss: 0.3462 - val_accuracy: 0.9363 Epoch 71/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0393 - accuracy: 0.9919 - val_loss: 0.3976 - val_accuracy: 0.9273
history = res31_final_aug.fit(aug_res31_train, validation_data = std_res31_validation, epochs = 150, callbacks = earlyStopper)
cnnf.model_metric_graph(history, res31_final_aug, std_res31_validation, name = "adapted VGG-13 (with augmented images)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 13s 41ms/step - loss: 2.4398 - accuracy: 0.1981 - val_loss: 3.1321 - val_accuracy: 0.0757 Epoch 2/150 283/283 [==============================] - 12s 39ms/step - loss: 1.9682 - accuracy: 0.3367 - val_loss: 3.3598 - val_accuracy: 0.2123 Epoch 3/150 283/283 [==============================] - 12s 40ms/step - loss: 1.7056 - accuracy: 0.4354 - val_loss: 1.7748 - val_accuracy: 0.3940 Epoch 4/150 283/283 [==============================] - 12s 39ms/step - loss: 1.5320 - accuracy: 0.4940 - val_loss: 1.6474 - val_accuracy: 0.4640 Epoch 5/150 283/283 [==============================] - 12s 40ms/step - loss: 1.4093 - accuracy: 0.5395 - val_loss: 1.7738 - val_accuracy: 0.4403 Epoch 6/150 283/283 [==============================] - 11s 39ms/step - loss: 1.2703 - accuracy: 0.5812 - val_loss: 2.0637 - val_accuracy: 0.4260 Epoch 7/150 283/283 [==============================] - 12s 39ms/step - loss: 1.1731 - accuracy: 0.6100 - val_loss: 1.4396 - val_accuracy: 0.5390 Epoch 8/150 283/283 [==============================] - 12s 39ms/step - loss: 1.0866 - accuracy: 0.6449 - val_loss: 1.3263 - val_accuracy: 0.5770 Epoch 9/150 283/283 [==============================] - 12s 40ms/step - loss: 1.0516 - accuracy: 0.6578 - val_loss: 1.4566 - val_accuracy: 0.5537 Epoch 10/150 283/283 [==============================] - 12s 40ms/step - loss: 0.9537 - accuracy: 0.6906 - val_loss: 1.2470 - val_accuracy: 0.6027 Epoch 11/150 283/283 [==============================] - 12s 39ms/step - loss: 0.8863 - accuracy: 0.7118 - val_loss: 1.0425 - val_accuracy: 0.6533 Epoch 12/150 283/283 [==============================] - 12s 39ms/step - loss: 0.8300 - accuracy: 0.7273 - val_loss: 0.9525 - val_accuracy: 0.6973 Epoch 13/150 283/283 [==============================] - 12s 40ms/step - loss: 0.7830 - accuracy: 0.7492 - val_loss: 0.9494 - val_accuracy: 0.7030 Epoch 14/150 283/283 [==============================] - 12s 39ms/step - loss: 0.7222 - accuracy: 0.7682 - val_loss: 1.1203 - val_accuracy: 0.6617 Epoch 15/150 283/283 [==============================] - 12s 39ms/step - loss: 0.6776 - accuracy: 0.7852 - val_loss: 0.9030 - val_accuracy: 0.7233 Epoch 16/150 283/283 [==============================] - 12s 39ms/step - loss: 0.6463 - accuracy: 0.7947 - val_loss: 0.8872 - val_accuracy: 0.7373 Epoch 17/150 283/283 [==============================] - 11s 39ms/step - loss: 0.6111 - accuracy: 0.8038 - val_loss: 0.9476 - val_accuracy: 0.7190 Epoch 18/150 283/283 [==============================] - 12s 39ms/step - loss: 0.5957 - accuracy: 0.8134 - val_loss: 0.8002 - val_accuracy: 0.7553 Epoch 19/150 283/283 [==============================] - 11s 39ms/step - loss: 0.5362 - accuracy: 0.8343 - val_loss: 0.6834 - val_accuracy: 0.8037 Epoch 20/150 283/283 [==============================] - 12s 40ms/step - loss: 0.5121 - accuracy: 0.8412 - val_loss: 0.9048 - val_accuracy: 0.7380 Epoch 21/150 283/283 [==============================] - 12s 39ms/step - loss: 0.4808 - accuracy: 0.8496 - val_loss: 0.9669 - val_accuracy: 0.7347 Epoch 22/150 283/283 [==============================] - 12s 39ms/step - loss: 0.4541 - accuracy: 0.8604 - val_loss: 0.7536 - val_accuracy: 0.7890 Epoch 23/150 283/283 [==============================] - 12s 39ms/step - loss: 0.4517 - accuracy: 0.8613 - val_loss: 0.8028 - val_accuracy: 0.7520 Epoch 24/150 283/283 [==============================] - 12s 40ms/step - loss: 0.4091 - accuracy: 0.8747 - val_loss: 0.6852 - val_accuracy: 0.7923 Epoch 25/150 283/283 [==============================] - 12s 40ms/step - loss: 0.4034 - accuracy: 0.8744 - val_loss: 0.5779 - val_accuracy: 0.8090 Epoch 26/150 283/283 [==============================] - 12s 40ms/step - loss: 0.3988 - accuracy: 0.8774 - val_loss: 0.8888 - val_accuracy: 0.7587 Epoch 27/150 283/283 [==============================] - 12s 39ms/step - loss: 0.3613 - accuracy: 0.8907 - val_loss: 0.4563 - val_accuracy: 0.8623 Epoch 28/150 283/283 [==============================] - 12s 39ms/step - loss: 0.3674 - accuracy: 0.8864 - val_loss: 0.5585 - val_accuracy: 0.8437 Epoch 29/150 283/283 [==============================] - 12s 40ms/step - loss: 0.3337 - accuracy: 0.8979 - val_loss: 0.7468 - val_accuracy: 0.7810 Epoch 30/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3342 - accuracy: 0.8971 - val_loss: 0.4933 - val_accuracy: 0.8520 Epoch 31/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3179 - accuracy: 0.9018 - val_loss: 0.4569 - val_accuracy: 0.8700 Epoch 32/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2969 - accuracy: 0.9071 - val_loss: 0.4171 - val_accuracy: 0.8753 Epoch 33/150 283/283 [==============================] - 12s 41ms/step - loss: 0.2819 - accuracy: 0.9176 - val_loss: 0.5433 - val_accuracy: 0.8453 Epoch 34/150 283/283 [==============================] - 12s 40ms/step - loss: 0.3211 - accuracy: 0.9002 - val_loss: 0.4284 - val_accuracy: 0.8720 Epoch 35/150 283/283 [==============================] - 12s 40ms/step - loss: 0.2797 - accuracy: 0.9126 - val_loss: 0.5331 - val_accuracy: 0.8547 Epoch 36/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2515 - accuracy: 0.9237 - val_loss: 0.4836 - val_accuracy: 0.8677 Epoch 37/150 283/283 [==============================] - 12s 39ms/step - loss: 0.2969 - accuracy: 0.9094 - val_loss: 0.4221 - val_accuracy: 0.8733 Epoch 38/150 283/283 [==============================] - 12s 39ms/step - loss: 0.2578 - accuracy: 0.9249 - val_loss: 0.3300 - val_accuracy: 0.9057 Epoch 39/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2460 - accuracy: 0.9249 - val_loss: 0.3838 - val_accuracy: 0.8860 Epoch 40/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2280 - accuracy: 0.9290 - val_loss: 0.4468 - val_accuracy: 0.8843 Epoch 41/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2475 - accuracy: 0.9253 - val_loss: 0.3884 - val_accuracy: 0.8853 Epoch 42/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2169 - accuracy: 0.9364 - val_loss: 0.4329 - val_accuracy: 0.8800 Epoch 43/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1938 - accuracy: 0.9420 - val_loss: 0.3714 - val_accuracy: 0.8980 Epoch 44/150 283/283 [==============================] - 12s 39ms/step - loss: 0.2043 - accuracy: 0.9362 - val_loss: 0.4344 - val_accuracy: 0.8780 Epoch 45/150 283/283 [==============================] - 12s 40ms/step - loss: 0.2223 - accuracy: 0.9318 - val_loss: 0.4860 - val_accuracy: 0.8693 Epoch 46/150 283/283 [==============================] - 12s 40ms/step - loss: 0.2395 - accuracy: 0.9303 - val_loss: 0.3626 - val_accuracy: 0.8983 Epoch 47/150 283/283 [==============================] - 12s 40ms/step - loss: 0.2079 - accuracy: 0.9376 - val_loss: 0.5098 - val_accuracy: 0.8593 Epoch 48/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1850 - accuracy: 0.9467 - val_loss: 0.3839 - val_accuracy: 0.8983 Epoch 49/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1656 - accuracy: 0.9500 - val_loss: 0.4774 - val_accuracy: 0.8720 Epoch 50/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1646 - accuracy: 0.9508 - val_loss: 0.4878 - val_accuracy: 0.8713 Epoch 51/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1722 - accuracy: 0.9502 - val_loss: 0.4132 - val_accuracy: 0.8897 Epoch 52/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1826 - accuracy: 0.9487 - val_loss: 0.3501 - val_accuracy: 0.9033 Epoch 53/150 283/283 [==============================] - 12s 39ms/step - loss: 0.1715 - accuracy: 0.9487 - val_loss: 0.5802 - val_accuracy: 0.8707 Epoch 54/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1932 - accuracy: 0.9420 - val_loss: 0.2726 - val_accuracy: 0.9263 Epoch 55/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1720 - accuracy: 0.9482 - val_loss: 0.3100 - val_accuracy: 0.9103 Epoch 56/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1804 - accuracy: 0.9456 - val_loss: 0.3963 - val_accuracy: 0.8970 Epoch 57/150 283/283 [==============================] - 12s 39ms/step - loss: 0.1529 - accuracy: 0.9559 - val_loss: 0.3867 - val_accuracy: 0.8980 Epoch 58/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1599 - accuracy: 0.9544 - val_loss: 0.3995 - val_accuracy: 0.8937 Epoch 59/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1440 - accuracy: 0.9579 - val_loss: 0.2803 - val_accuracy: 0.9257 Epoch 60/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1552 - accuracy: 0.9544 - val_loss: 0.3434 - val_accuracy: 0.9077 Epoch 61/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1676 - accuracy: 0.9521 - val_loss: 0.3224 - val_accuracy: 0.9117 Epoch 62/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1349 - accuracy: 0.9608 - val_loss: 0.3345 - val_accuracy: 0.9140 Epoch 63/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1440 - accuracy: 0.9580 - val_loss: 0.3906 - val_accuracy: 0.8993 Epoch 64/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1292 - accuracy: 0.9605 - val_loss: 0.3295 - val_accuracy: 0.9053 Epoch 65/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1492 - accuracy: 0.9535 - val_loss: 0.3193 - val_accuracy: 0.9177 Epoch 66/150 283/283 [==============================] - 12s 39ms/step - loss: 0.1341 - accuracy: 0.9600 - val_loss: 0.2630 - val_accuracy: 0.9260 Epoch 67/150 283/283 [==============================] - 12s 39ms/step - loss: 0.1542 - accuracy: 0.9568 - val_loss: 0.4331 - val_accuracy: 0.8877 Epoch 68/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1374 - accuracy: 0.9595 - val_loss: 0.2593 - val_accuracy: 0.9257 Epoch 69/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1272 - accuracy: 0.9640 - val_loss: 0.2852 - val_accuracy: 0.9290 Epoch 70/150 283/283 [==============================] - 12s 39ms/step - loss: 0.1205 - accuracy: 0.9631 - val_loss: 0.2339 - val_accuracy: 0.9403 Epoch 71/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1338 - accuracy: 0.9590 - val_loss: 0.3912 - val_accuracy: 0.9030 Epoch 72/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1460 - accuracy: 0.9577 - val_loss: 0.3865 - val_accuracy: 0.8970 Epoch 73/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1195 - accuracy: 0.9671 - val_loss: 0.2593 - val_accuracy: 0.9337 Epoch 74/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1345 - accuracy: 0.9603 - val_loss: 0.3229 - val_accuracy: 0.9217 Epoch 75/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1908 - accuracy: 0.9432 - val_loss: 0.2515 - val_accuracy: 0.9327 Epoch 76/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1100 - accuracy: 0.9675 - val_loss: 0.2455 - val_accuracy: 0.9327 Epoch 77/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1035 - accuracy: 0.9685 - val_loss: 0.2305 - val_accuracy: 0.9380 Epoch 78/150 283/283 [==============================] - 11s 39ms/step - loss: 0.1015 - accuracy: 0.9705 - val_loss: 0.2674 - val_accuracy: 0.9313 Epoch 79/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1087 - accuracy: 0.9695 - val_loss: 0.2712 - val_accuracy: 0.9293 Epoch 80/150 283/283 [==============================] - 12s 42ms/step - loss: 0.1015 - accuracy: 0.9728 - val_loss: 0.3250 - val_accuracy: 0.9230 Epoch 81/150 283/283 [==============================] - 12s 41ms/step - loss: 0.1056 - accuracy: 0.9693 - val_loss: 0.4741 - val_accuracy: 0.8817 Epoch 82/150 283/283 [==============================] - 12s 41ms/step - loss: 0.1211 - accuracy: 0.9665 - val_loss: 0.3901 - val_accuracy: 0.8987 Epoch 83/150 283/283 [==============================] - 12s 41ms/step - loss: 0.1119 - accuracy: 0.9669 - val_loss: 0.2617 - val_accuracy: 0.9340 Epoch 84/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1135 - accuracy: 0.9658 - val_loss: 0.2549 - val_accuracy: 0.9400 Epoch 85/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1044 - accuracy: 0.9698 - val_loss: 0.2705 - val_accuracy: 0.9327 Epoch 86/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1140 - accuracy: 0.9667 - val_loss: 0.3460 - val_accuracy: 0.9150 Epoch 87/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1174 - accuracy: 0.9659 - val_loss: 0.2441 - val_accuracy: 0.9347 Epoch 88/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0848 - accuracy: 0.9763 - val_loss: 0.2476 - val_accuracy: 0.9380 Epoch 89/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0976 - accuracy: 0.9737 - val_loss: 0.3422 - val_accuracy: 0.9143 Epoch 90/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1236 - accuracy: 0.9640 - val_loss: 0.2674 - val_accuracy: 0.9350 Epoch 91/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1048 - accuracy: 0.9731 - val_loss: 0.2862 - val_accuracy: 0.9383 Epoch 92/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1309 - accuracy: 0.9643 - val_loss: 0.2821 - val_accuracy: 0.9257 Epoch 93/150 283/283 [==============================] - 11s 38ms/step - loss: 0.1016 - accuracy: 0.9704 - val_loss: 0.2455 - val_accuracy: 0.9357 Epoch 94/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1114 - accuracy: 0.9677 - val_loss: 0.2208 - val_accuracy: 0.9407 Epoch 95/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0896 - accuracy: 0.9756 - val_loss: 0.3661 - val_accuracy: 0.9177 Epoch 96/150 283/283 [==============================] - 11s 37ms/step - loss: 0.0801 - accuracy: 0.9775 - val_loss: 0.3216 - val_accuracy: 0.9203 Epoch 97/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1135 - accuracy: 0.9683 - val_loss: 0.2149 - val_accuracy: 0.9470 Epoch 98/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0886 - accuracy: 0.9752 - val_loss: 0.2062 - val_accuracy: 0.9490 Epoch 99/150 283/283 [==============================] - 11s 38ms/step - loss: 0.0715 - accuracy: 0.9798 - val_loss: 0.2152 - val_accuracy: 0.9470 Epoch 100/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0810 - accuracy: 0.9766 - val_loss: 0.3809 - val_accuracy: 0.8963 Epoch 101/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0921 - accuracy: 0.9743 - val_loss: 0.2433 - val_accuracy: 0.9353 Epoch 102/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0834 - accuracy: 0.9755 - val_loss: 0.1996 - val_accuracy: 0.9520 Epoch 103/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0838 - accuracy: 0.9763 - val_loss: 0.3722 - val_accuracy: 0.9177 Epoch 104/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0742 - accuracy: 0.9781 - val_loss: 0.2438 - val_accuracy: 0.9383 Epoch 105/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0906 - accuracy: 0.9750 - val_loss: 0.2601 - val_accuracy: 0.9283 Epoch 106/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0721 - accuracy: 0.9787 - val_loss: 0.2436 - val_accuracy: 0.9403 Epoch 107/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0858 - accuracy: 0.9740 - val_loss: 0.2709 - val_accuracy: 0.9353 Epoch 108/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0931 - accuracy: 0.9726 - val_loss: 0.2368 - val_accuracy: 0.9447 Epoch 109/150 283/283 [==============================] - 12s 40ms/step - loss: 0.1092 - accuracy: 0.9703 - val_loss: 0.2127 - val_accuracy: 0.9460 Epoch 110/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0830 - accuracy: 0.9749 - val_loss: 0.2678 - val_accuracy: 0.9287 Epoch 111/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0896 - accuracy: 0.9753 - val_loss: 0.2090 - val_accuracy: 0.9500 Epoch 112/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0629 - accuracy: 0.9824 - val_loss: 0.3180 - val_accuracy: 0.9233 Epoch 113/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0820 - accuracy: 0.9745 - val_loss: 0.2109 - val_accuracy: 0.9483 Epoch 114/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0783 - accuracy: 0.9778 - val_loss: 0.2404 - val_accuracy: 0.9410 Epoch 115/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0849 - accuracy: 0.9781 - val_loss: 0.2048 - val_accuracy: 0.9450 Epoch 116/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0608 - accuracy: 0.9818 - val_loss: 0.2484 - val_accuracy: 0.9380 Epoch 117/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0780 - accuracy: 0.9765 - val_loss: 0.2212 - val_accuracy: 0.9473 Epoch 118/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0696 - accuracy: 0.9793 - val_loss: 0.2367 - val_accuracy: 0.9400 Epoch 119/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0819 - accuracy: 0.9763 - val_loss: 0.3607 - val_accuracy: 0.9107 Epoch 120/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0887 - accuracy: 0.9736 - val_loss: 0.2148 - val_accuracy: 0.9460 Epoch 121/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0885 - accuracy: 0.9760 - val_loss: 0.2497 - val_accuracy: 0.9383 Epoch 122/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0594 - accuracy: 0.9833 - val_loss: 0.2933 - val_accuracy: 0.9273 Epoch 123/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0838 - accuracy: 0.9763 - val_loss: 0.2785 - val_accuracy: 0.9377 Epoch 124/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0892 - accuracy: 0.9719 - val_loss: 0.2149 - val_accuracy: 0.9497 Epoch 125/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0957 - accuracy: 0.9747 - val_loss: 0.1956 - val_accuracy: 0.9493 Epoch 126/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0576 - accuracy: 0.9839 - val_loss: 0.2282 - val_accuracy: 0.9467 Epoch 127/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0678 - accuracy: 0.9813 - val_loss: 0.2319 - val_accuracy: 0.9457 Epoch 128/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0596 - accuracy: 0.9839 - val_loss: 0.2720 - val_accuracy: 0.9413 Epoch 129/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0615 - accuracy: 0.9828 - val_loss: 0.2339 - val_accuracy: 0.9423 Epoch 130/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0876 - accuracy: 0.9744 - val_loss: 0.2406 - val_accuracy: 0.9480 Epoch 131/150 283/283 [==============================] - 12s 41ms/step - loss: 0.1020 - accuracy: 0.9699 - val_loss: 0.2061 - val_accuracy: 0.9530 Epoch 132/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0628 - accuracy: 0.9831 - val_loss: 0.1728 - val_accuracy: 0.9607 Epoch 133/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0525 - accuracy: 0.9854 - val_loss: 0.2360 - val_accuracy: 0.9470 Epoch 134/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0664 - accuracy: 0.9817 - val_loss: 0.1948 - val_accuracy: 0.9607 Epoch 135/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0519 - accuracy: 0.9846 - val_loss: 0.2283 - val_accuracy: 0.9533 Epoch 136/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0568 - accuracy: 0.9839 - val_loss: 0.2471 - val_accuracy: 0.9440 Epoch 137/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0531 - accuracy: 0.9866 - val_loss: 0.2069 - val_accuracy: 0.9550 Epoch 138/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0722 - accuracy: 0.9795 - val_loss: 0.2429 - val_accuracy: 0.9487 Epoch 139/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0651 - accuracy: 0.9817 - val_loss: 0.2891 - val_accuracy: 0.9397 Epoch 140/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0603 - accuracy: 0.9836 - val_loss: 0.2383 - val_accuracy: 0.9377 Epoch 141/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0757 - accuracy: 0.9790 - val_loss: 0.2025 - val_accuracy: 0.9533 Epoch 142/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0648 - accuracy: 0.9813 - val_loss: 0.2236 - val_accuracy: 0.9497 Epoch 143/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0444 - accuracy: 0.9863 - val_loss: 0.2474 - val_accuracy: 0.9487 Epoch 144/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0627 - accuracy: 0.9835 - val_loss: 0.1888 - val_accuracy: 0.9543 Epoch 145/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0532 - accuracy: 0.9849 - val_loss: 0.2598 - val_accuracy: 0.9387 Epoch 146/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0491 - accuracy: 0.9875 - val_loss: 0.2492 - val_accuracy: 0.9420 Epoch 147/150 283/283 [==============================] - 12s 41ms/step - loss: 0.0513 - accuracy: 0.9847 - val_loss: 0.3151 - val_accuracy: 0.9250 Epoch 148/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0715 - accuracy: 0.9809 - val_loss: 0.3110 - val_accuracy: 0.9323 Epoch 149/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0692 - accuracy: 0.9796 - val_loss: 0.2635 - val_accuracy: 0.9383 Epoch 150/150 283/283 [==============================] - 12s 40ms/step - loss: 0.0606 - accuracy: 0.9840 - val_loss: 0.3343 - val_accuracy: 0.9267
history = res31_final_cw_aug.fit(aug_res31_train, validation_data = std_res31_validation, epochs = 150, callbacks = earlyStopper, class_weight = class_weights)
cnnf.model_metric_graph(history, res31_final_cw_aug, std_res31_validation, name = "adapted VGG-13 (with class weights and augmented images applied)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 13s 40ms/step - loss: 2.5149 - accuracy: 0.1453 - val_loss: 3.0856 - val_accuracy: 0.0737 Epoch 2/150 283/283 [==============================] - 11s 37ms/step - loss: 2.2190 - accuracy: 0.2105 - val_loss: 2.1430 - val_accuracy: 0.2497 Epoch 3/150 283/283 [==============================] - 11s 38ms/step - loss: 2.0479 - accuracy: 0.2962 - val_loss: 2.3048 - val_accuracy: 0.3147 Epoch 4/150 283/283 [==============================] - 11s 37ms/step - loss: 1.8482 - accuracy: 0.3752 - val_loss: 1.9588 - val_accuracy: 0.3890 Epoch 5/150 283/283 [==============================] - 11s 37ms/step - loss: 1.6860 - accuracy: 0.4356 - val_loss: 1.5664 - val_accuracy: 0.5010 Epoch 6/150 283/283 [==============================] - 11s 38ms/step - loss: 1.5316 - accuracy: 0.4940 - val_loss: 1.5872 - val_accuracy: 0.4677 Epoch 7/150 283/283 [==============================] - 11s 37ms/step - loss: 1.4193 - accuracy: 0.5312 - val_loss: 1.9624 - val_accuracy: 0.4530 Epoch 8/150 283/283 [==============================] - 11s 37ms/step - loss: 1.3166 - accuracy: 0.5535 - val_loss: 2.1403 - val_accuracy: 0.3727 Epoch 9/150 283/283 [==============================] - 11s 38ms/step - loss: 1.2070 - accuracy: 0.5986 - val_loss: 1.4844 - val_accuracy: 0.5483 Epoch 10/150 283/283 [==============================] - 11s 37ms/step - loss: 1.1155 - accuracy: 0.6304 - val_loss: 1.6746 - val_accuracy: 0.5100 Epoch 11/150 283/283 [==============================] - 11s 37ms/step - loss: 1.0579 - accuracy: 0.6459 - val_loss: 1.0563 - val_accuracy: 0.6673 Epoch 12/150 283/283 [==============================] - 11s 39ms/step - loss: 1.0047 - accuracy: 0.6643 - val_loss: 0.9559 - val_accuracy: 0.6867 Epoch 13/150 283/283 [==============================] - 12s 39ms/step - loss: 0.9543 - accuracy: 0.6791 - val_loss: 2.0994 - val_accuracy: 0.4927 Epoch 14/150 283/283 [==============================] - 11s 38ms/step - loss: 1.1690 - accuracy: 0.6121 - val_loss: 1.1864 - val_accuracy: 0.6237 Epoch 15/150 283/283 [==============================] - 11s 38ms/step - loss: 0.9137 - accuracy: 0.7004 - val_loss: 1.0238 - val_accuracy: 0.6867 Epoch 16/150 283/283 [==============================] - 11s 38ms/step - loss: 0.8103 - accuracy: 0.7307 - val_loss: 0.7187 - val_accuracy: 0.7800 Epoch 17/150 283/283 [==============================] - 11s 38ms/step - loss: 0.7591 - accuracy: 0.7460 - val_loss: 0.9362 - val_accuracy: 0.7097 Epoch 18/150 283/283 [==============================] - 12s 39ms/step - loss: 0.7571 - accuracy: 0.7482 - val_loss: 0.8355 - val_accuracy: 0.7327 Epoch 19/150 283/283 [==============================] - 11s 38ms/step - loss: 0.6984 - accuracy: 0.7682 - val_loss: 0.8857 - val_accuracy: 0.7363 Epoch 20/150 283/283 [==============================] - 11s 38ms/step - loss: 0.6822 - accuracy: 0.7730 - val_loss: 0.7157 - val_accuracy: 0.7760 Epoch 21/150 283/283 [==============================] - 11s 39ms/step - loss: 0.5831 - accuracy: 0.8105 - val_loss: 0.7471 - val_accuracy: 0.7750 Epoch 22/150 283/283 [==============================] - 11s 38ms/step - loss: 0.5787 - accuracy: 0.8082 - val_loss: 0.5521 - val_accuracy: 0.8283 Epoch 23/150 283/283 [==============================] - 11s 37ms/step - loss: 0.5417 - accuracy: 0.8179 - val_loss: 0.7530 - val_accuracy: 0.7780 Epoch 24/150 283/283 [==============================] - 11s 38ms/step - loss: 0.5181 - accuracy: 0.8270 - val_loss: 0.5584 - val_accuracy: 0.8227 Epoch 25/150 283/283 [==============================] - 11s 38ms/step - loss: 0.5165 - accuracy: 0.8353 - val_loss: 0.5837 - val_accuracy: 0.8213 Epoch 26/150 283/283 [==============================] - 11s 38ms/step - loss: 0.4991 - accuracy: 0.8360 - val_loss: 1.0233 - val_accuracy: 0.7293 Epoch 27/150 283/283 [==============================] - 11s 38ms/step - loss: 0.4335 - accuracy: 0.8581 - val_loss: 0.6213 - val_accuracy: 0.8237 Epoch 28/150 283/283 [==============================] - 11s 38ms/step - loss: 0.4489 - accuracy: 0.8551 - val_loss: 0.7844 - val_accuracy: 0.7833 Epoch 29/150 283/283 [==============================] - 11s 38ms/step - loss: 0.4265 - accuracy: 0.8600 - val_loss: 0.7857 - val_accuracy: 0.7840 Epoch 30/150 283/283 [==============================] - 11s 38ms/step - loss: 0.4494 - accuracy: 0.8530 - val_loss: 0.4980 - val_accuracy: 0.8500 Epoch 31/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3697 - accuracy: 0.8809 - val_loss: 0.3870 - val_accuracy: 0.8857 Epoch 32/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3694 - accuracy: 0.8779 - val_loss: 0.4751 - val_accuracy: 0.8537 Epoch 33/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3596 - accuracy: 0.8806 - val_loss: 0.4119 - val_accuracy: 0.8760 Epoch 34/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3264 - accuracy: 0.8911 - val_loss: 0.9397 - val_accuracy: 0.7527 Epoch 35/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3324 - accuracy: 0.8902 - val_loss: 0.4271 - val_accuracy: 0.8777 Epoch 36/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3408 - accuracy: 0.8947 - val_loss: 0.5157 - val_accuracy: 0.8527 Epoch 37/150 283/283 [==============================] - 11s 38ms/step - loss: 0.3534 - accuracy: 0.8865 - val_loss: 0.4803 - val_accuracy: 0.8707 Epoch 38/150 283/283 [==============================] - 11s 39ms/step - loss: 0.2958 - accuracy: 0.9012 - val_loss: 0.3601 - val_accuracy: 0.8877 Epoch 39/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2881 - accuracy: 0.9107 - val_loss: 0.4203 - val_accuracy: 0.8690 Epoch 40/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2815 - accuracy: 0.9053 - val_loss: 0.5272 - val_accuracy: 0.8590 Epoch 41/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2678 - accuracy: 0.9126 - val_loss: 0.4809 - val_accuracy: 0.8607 Epoch 42/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2727 - accuracy: 0.9109 - val_loss: 0.3893 - val_accuracy: 0.8933 Epoch 43/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2599 - accuracy: 0.9195 - val_loss: 0.5668 - val_accuracy: 0.8490 Epoch 44/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2802 - accuracy: 0.9149 - val_loss: 0.3140 - val_accuracy: 0.9087 Epoch 45/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2512 - accuracy: 0.9180 - val_loss: 0.4343 - val_accuracy: 0.8823 Epoch 46/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2404 - accuracy: 0.9225 - val_loss: 0.3760 - val_accuracy: 0.8950 Epoch 47/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2278 - accuracy: 0.9270 - val_loss: 0.3735 - val_accuracy: 0.8927 Epoch 48/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2096 - accuracy: 0.9329 - val_loss: 0.2525 - val_accuracy: 0.9273 Epoch 49/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2155 - accuracy: 0.9309 - val_loss: 0.3685 - val_accuracy: 0.8893 Epoch 50/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2020 - accuracy: 0.9333 - val_loss: 0.3377 - val_accuracy: 0.9070 Epoch 51/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2058 - accuracy: 0.9354 - val_loss: 0.4763 - val_accuracy: 0.8667 Epoch 52/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2419 - accuracy: 0.9248 - val_loss: 0.3435 - val_accuracy: 0.9000 Epoch 53/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2275 - accuracy: 0.9246 - val_loss: 0.3159 - val_accuracy: 0.9063 Epoch 54/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2216 - accuracy: 0.9304 - val_loss: 0.3660 - val_accuracy: 0.9020 Epoch 55/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1978 - accuracy: 0.9379 - val_loss: 0.3105 - val_accuracy: 0.9133 Epoch 56/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2227 - accuracy: 0.9286 - val_loss: 0.3057 - val_accuracy: 0.9107 Epoch 57/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2303 - accuracy: 0.9258 - val_loss: 0.3883 - val_accuracy: 0.8900 Epoch 58/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1750 - accuracy: 0.9421 - val_loss: 0.2655 - val_accuracy: 0.9253 Epoch 59/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2428 - accuracy: 0.9220 - val_loss: 0.3117 - val_accuracy: 0.9067 Epoch 60/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1583 - accuracy: 0.9499 - val_loss: 0.2866 - val_accuracy: 0.9243 Epoch 61/150 283/283 [==============================] - 10s 35ms/step - loss: 0.2419 - accuracy: 0.9262 - val_loss: 0.2558 - val_accuracy: 0.9300 Epoch 62/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1864 - accuracy: 0.9380 - val_loss: 0.2828 - val_accuracy: 0.9223 Epoch 63/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1524 - accuracy: 0.9495 - val_loss: 0.3207 - val_accuracy: 0.9163 Epoch 64/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1556 - accuracy: 0.9533 - val_loss: 0.2774 - val_accuracy: 0.9250 Epoch 65/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1495 - accuracy: 0.9548 - val_loss: 0.2605 - val_accuracy: 0.9373 Epoch 66/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1595 - accuracy: 0.9466 - val_loss: 0.2504 - val_accuracy: 0.9330 Epoch 67/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1534 - accuracy: 0.9528 - val_loss: 0.3194 - val_accuracy: 0.9110 Epoch 68/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1536 - accuracy: 0.9529 - val_loss: 0.3813 - val_accuracy: 0.8997 Epoch 69/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1926 - accuracy: 0.9433 - val_loss: 0.2857 - val_accuracy: 0.9207 Epoch 70/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1357 - accuracy: 0.9527 - val_loss: 0.2566 - val_accuracy: 0.9300 Epoch 71/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1391 - accuracy: 0.9566 - val_loss: 0.2806 - val_accuracy: 0.9273 Epoch 72/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1489 - accuracy: 0.9515 - val_loss: 0.5356 - val_accuracy: 0.8610 Epoch 73/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1474 - accuracy: 0.9525 - val_loss: 0.2196 - val_accuracy: 0.9447 Epoch 74/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1370 - accuracy: 0.9537 - val_loss: 0.4135 - val_accuracy: 0.8980 Epoch 75/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1428 - accuracy: 0.9550 - val_loss: 0.2680 - val_accuracy: 0.9270 Epoch 76/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1522 - accuracy: 0.9546 - val_loss: 0.2470 - val_accuracy: 0.9367 Epoch 77/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1219 - accuracy: 0.9595 - val_loss: 0.2811 - val_accuracy: 0.9260 Epoch 78/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1312 - accuracy: 0.9598 - val_loss: 0.2948 - val_accuracy: 0.9243 Epoch 79/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1429 - accuracy: 0.9560 - val_loss: 0.2648 - val_accuracy: 0.9290 Epoch 80/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1308 - accuracy: 0.9599 - val_loss: 0.2565 - val_accuracy: 0.9300 Epoch 81/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1240 - accuracy: 0.9610 - val_loss: 0.4957 - val_accuracy: 0.8673 Epoch 82/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1225 - accuracy: 0.9592 - val_loss: 0.2080 - val_accuracy: 0.9457 Epoch 83/150 283/283 [==============================] - 11s 37ms/step - loss: 0.1868 - accuracy: 0.9446 - val_loss: 0.2859 - val_accuracy: 0.9283 Epoch 84/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1284 - accuracy: 0.9598 - val_loss: 0.2112 - val_accuracy: 0.9447 Epoch 85/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1457 - accuracy: 0.9575 - val_loss: 0.2161 - val_accuracy: 0.9437 Epoch 86/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1260 - accuracy: 0.9608 - val_loss: 0.3359 - val_accuracy: 0.9110 Epoch 87/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1255 - accuracy: 0.9649 - val_loss: 0.2728 - val_accuracy: 0.9260 Epoch 88/150 283/283 [==============================] - 11s 36ms/step - loss: 0.0968 - accuracy: 0.9701 - val_loss: 0.4084 - val_accuracy: 0.8993 Epoch 89/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1103 - accuracy: 0.9639 - val_loss: 0.1984 - val_accuracy: 0.9483 Epoch 90/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1077 - accuracy: 0.9664 - val_loss: 0.2646 - val_accuracy: 0.9363 Epoch 91/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1179 - accuracy: 0.9665 - val_loss: 0.2522 - val_accuracy: 0.9333 Epoch 92/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1243 - accuracy: 0.9634 - val_loss: 0.3316 - val_accuracy: 0.9163 Epoch 93/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1188 - accuracy: 0.9617 - val_loss: 0.2566 - val_accuracy: 0.9323 Epoch 94/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0981 - accuracy: 0.9677 - val_loss: 0.4244 - val_accuracy: 0.9033 Epoch 95/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0938 - accuracy: 0.9711 - val_loss: 0.2364 - val_accuracy: 0.9433 Epoch 96/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1153 - accuracy: 0.9634 - val_loss: 0.2433 - val_accuracy: 0.9283 Epoch 97/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1042 - accuracy: 0.9658 - val_loss: 0.4172 - val_accuracy: 0.8870 Epoch 98/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0904 - accuracy: 0.9729 - val_loss: 0.3618 - val_accuracy: 0.9133 Epoch 99/150 283/283 [==============================] - 11s 36ms/step - loss: 0.0957 - accuracy: 0.9722 - val_loss: 0.2508 - val_accuracy: 0.9330 Epoch 100/150 283/283 [==============================] - 11s 35ms/step - loss: 0.0949 - accuracy: 0.9695 - val_loss: 0.3296 - val_accuracy: 0.9220 Epoch 101/150 283/283 [==============================] - 11s 36ms/step - loss: 0.1316 - accuracy: 0.9601 - val_loss: 0.3467 - val_accuracy: 0.9093 Epoch 102/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1065 - accuracy: 0.9677 - val_loss: 0.2188 - val_accuracy: 0.9450 Epoch 103/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1047 - accuracy: 0.9652 - val_loss: 0.2321 - val_accuracy: 0.9410 Epoch 104/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1094 - accuracy: 0.9662 - val_loss: 0.3980 - val_accuracy: 0.8990 Epoch 105/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0991 - accuracy: 0.9726 - val_loss: 0.2598 - val_accuracy: 0.9350 Epoch 106/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0665 - accuracy: 0.9795 - val_loss: 0.2365 - val_accuracy: 0.9420 Epoch 107/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0843 - accuracy: 0.9744 - val_loss: 0.2373 - val_accuracy: 0.9370 Epoch 108/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1062 - accuracy: 0.9685 - val_loss: 0.3143 - val_accuracy: 0.9163 Epoch 109/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1069 - accuracy: 0.9682 - val_loss: 0.2349 - val_accuracy: 0.9427 Epoch 110/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0953 - accuracy: 0.9734 - val_loss: 0.3617 - val_accuracy: 0.9153 Epoch 111/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0714 - accuracy: 0.9782 - val_loss: 0.3045 - val_accuracy: 0.9193 Epoch 112/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0765 - accuracy: 0.9783 - val_loss: 0.2720 - val_accuracy: 0.9353 Epoch 113/150 283/283 [==============================] - 11s 36ms/step - loss: 0.0848 - accuracy: 0.9742 - val_loss: 0.2748 - val_accuracy: 0.9277 Epoch 114/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1098 - accuracy: 0.9657 - val_loss: 0.2828 - val_accuracy: 0.9257 Epoch 115/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0797 - accuracy: 0.9766 - val_loss: 0.3115 - val_accuracy: 0.9253 Epoch 116/150 283/283 [==============================] - 10s 35ms/step - loss: 0.1159 - accuracy: 0.9608 - val_loss: 0.2272 - val_accuracy: 0.9400 Epoch 117/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0870 - accuracy: 0.9706 - val_loss: 0.2055 - val_accuracy: 0.9533 Epoch 118/150 283/283 [==============================] - 10s 35ms/step - loss: 0.0770 - accuracy: 0.9764 - val_loss: 0.2620 - val_accuracy: 0.9497 Epoch 119/150 283/283 [==============================] - 11s 36ms/step - loss: 0.0984 - accuracy: 0.9732 - val_loss: 0.2034 - val_accuracy: 0.9473
improvement31_model_list = [res31_vggmodel, res31_final_cw, res31_final_aug, res31_final_cw_aug]
improvement31_data_list = [std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation]
name_list = ["VGG Model with no improvements", "Class Weights Applied", "Augmentation Applied", "Class Weights and Augmentation Applied"]
cnnf.model_comparison(improvement31_model_list, improvement31_data_list, name_list)
94/94 [==============================] - 2s 16ms/step - loss: 0.4841 - accuracy: 0.8953 94/94 [==============================] - 2s 15ms/step - loss: 0.3976 - accuracy: 0.9273 94/94 [==============================] - 2s 15ms/step - loss: 0.3343 - accuracy: 0.9267 94/94 [==============================] - 2s 15ms/step - loss: 0.2034 - accuracy: 0.9473
From the above, we can see that all of the improvements that we attempted have had a improvement on the model's performance, with the application of Class Weights and Augmentation having the most profound improvement, with accuracy and loss both being the best performing among all the 4 permutations. Hence, we shall now select this model and hypertune it.
From this, we can conclude that augmentation is useful for the 31 x 31 image input size.
We shall save the model that we want to hypertune first, to allow the hypertuning to be seperate from the other models, saving on computational resources.
res31_final_cw_aug.save('res31_best.h5')
Now, we shall run the best VGG model layout 3 more times, and with 3 different permutations.
Once those models have been trained, we shall compare it to the base model, with none of these improvments applied, so that we can see how it performs relative to each other.
res128_final = Sequential()
res128_final.add(Input(shape = (128, 128, 1)))
res128_final.add(vgg_block(2, 32))
res128_final.add(vgg_block(2, 64))
res128_final.add(vgg_block(3, 128))
res128_final.add(vgg_block(3, 256))
res128_final.add(vgg_block(3, 256))
res128_final.add(Dense(4096, activation = "relu"))
res128_final.add(Dense(4096, activation = "relu"))
res128_final.add(ActivityRegularization(l1 = 0.00001, l2 = 0.00001))
res128_final.add(Flatten())
res128_final.add(Dense(num_classes, activation = "softmax"))
res128_final.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
res128_final_cw = clone_model(res128_final)
res128_final_aug = clone_model(res128_final)
res128_final_cw_aug = clone_model(res128_final)
res128_final_cw.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
res128_final_aug.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
res128_final_cw_aug.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = res128_final_cw.fit(std_res128_train, validation_data = std_res128_validation, epochs = 150, callbacks = earlyStopper, class_weight = class_weights)
cnnf.model_metric_graph(history, res128_final_cw, std_res31_validation, name = "adapted VGG-16 (with class weights applied)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 53s 179ms/step - loss: 2.3629 - accuracy: 0.2233 - val_loss: 3.3180 - val_accuracy: 0.0667 Epoch 2/150 283/283 [==============================] - 50s 177ms/step - loss: 1.6899 - accuracy: 0.4466 - val_loss: 2.0270 - val_accuracy: 0.4310 Epoch 3/150 283/283 [==============================] - 51s 179ms/step - loss: 1.3144 - accuracy: 0.5714 - val_loss: 1.5473 - val_accuracy: 0.5450 Epoch 4/150 283/283 [==============================] - 50s 176ms/step - loss: 1.0891 - accuracy: 0.6382 - val_loss: 1.5125 - val_accuracy: 0.5380 Epoch 5/150 283/283 [==============================] - 50s 177ms/step - loss: 0.9119 - accuracy: 0.6942 - val_loss: 1.7319 - val_accuracy: 0.4853 Epoch 6/150 283/283 [==============================] - 50s 176ms/step - loss: 0.7636 - accuracy: 0.7484 - val_loss: 1.0949 - val_accuracy: 0.6930 Epoch 7/150 283/283 [==============================] - 50s 176ms/step - loss: 0.6525 - accuracy: 0.7871 - val_loss: 1.1454 - val_accuracy: 0.6877 Epoch 8/150 283/283 [==============================] - 50s 176ms/step - loss: 0.5617 - accuracy: 0.8108 - val_loss: 0.8808 - val_accuracy: 0.7130 Epoch 9/150 283/283 [==============================] - 50s 177ms/step - loss: 0.4703 - accuracy: 0.8443 - val_loss: 0.6634 - val_accuracy: 0.7907 Epoch 10/150 283/283 [==============================] - 52s 181ms/step - loss: 0.4420 - accuracy: 0.8539 - val_loss: 0.6856 - val_accuracy: 0.8153 Epoch 11/150 283/283 [==============================] - 51s 178ms/step - loss: 0.3386 - accuracy: 0.8848 - val_loss: 0.7040 - val_accuracy: 0.8057 Epoch 12/150 283/283 [==============================] - 50s 177ms/step - loss: 0.2894 - accuracy: 0.9031 - val_loss: 0.4977 - val_accuracy: 0.8597 Epoch 13/150 283/283 [==============================] - 50s 176ms/step - loss: 0.3191 - accuracy: 0.8929 - val_loss: 0.4525 - val_accuracy: 0.8717 Epoch 14/150 283/283 [==============================] - 51s 179ms/step - loss: 0.2634 - accuracy: 0.9129 - val_loss: 0.5365 - val_accuracy: 0.8400 Epoch 15/150 283/283 [==============================] - 51s 179ms/step - loss: 0.3445 - accuracy: 0.8890 - val_loss: 0.4246 - val_accuracy: 0.8687 Epoch 16/150 283/283 [==============================] - 51s 178ms/step - loss: 0.2129 - accuracy: 0.9335 - val_loss: 0.3143 - val_accuracy: 0.9133 Epoch 17/150 283/283 [==============================] - 51s 180ms/step - loss: 0.1800 - accuracy: 0.9405 - val_loss: 0.5567 - val_accuracy: 0.8617 Epoch 18/150 283/283 [==============================] - 51s 179ms/step - loss: 0.2519 - accuracy: 0.9198 - val_loss: 0.4602 - val_accuracy: 0.8780 Epoch 19/150 283/283 [==============================] - 51s 179ms/step - loss: 0.1488 - accuracy: 0.9504 - val_loss: 0.4120 - val_accuracy: 0.8960 Epoch 20/150 283/283 [==============================] - 51s 177ms/step - loss: 0.1131 - accuracy: 0.9630 - val_loss: 0.3844 - val_accuracy: 0.8977 Epoch 21/150 283/283 [==============================] - 50s 177ms/step - loss: 0.0933 - accuracy: 0.9673 - val_loss: 0.5925 - val_accuracy: 0.8610 Epoch 22/150 283/283 [==============================] - 51s 178ms/step - loss: 0.1206 - accuracy: 0.9599 - val_loss: 0.3952 - val_accuracy: 0.9010 Epoch 23/150 283/283 [==============================] - 52s 181ms/step - loss: 0.2825 - accuracy: 0.9114 - val_loss: 0.5023 - val_accuracy: 0.8720 Epoch 24/150 283/283 [==============================] - 52s 182ms/step - loss: 0.2050 - accuracy: 0.9311 - val_loss: 0.3410 - val_accuracy: 0.9100 Epoch 25/150 283/283 [==============================] - 53s 187ms/step - loss: 0.0992 - accuracy: 0.9659 - val_loss: 0.3153 - val_accuracy: 0.9187 Epoch 26/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0931 - accuracy: 0.9700 - val_loss: 0.3916 - val_accuracy: 0.9060 Epoch 27/150 283/283 [==============================] - 51s 179ms/step - loss: 0.1013 - accuracy: 0.9681 - val_loss: 0.2619 - val_accuracy: 0.9353 Epoch 28/150 283/283 [==============================] - 51s 179ms/step - loss: 0.0530 - accuracy: 0.9839 - val_loss: 0.2774 - val_accuracy: 0.9397 Epoch 29/150 283/283 [==============================] - 51s 179ms/step - loss: 0.0755 - accuracy: 0.9774 - val_loss: 0.6310 - val_accuracy: 0.8490 Epoch 30/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0728 - accuracy: 0.9770 - val_loss: 0.5339 - val_accuracy: 0.8737 Epoch 31/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0952 - accuracy: 0.9715 - val_loss: 0.7391 - val_accuracy: 0.8370 Epoch 32/150 283/283 [==============================] - 51s 179ms/step - loss: 0.0972 - accuracy: 0.9678 - val_loss: 0.3420 - val_accuracy: 0.9237 Epoch 33/150 283/283 [==============================] - 51s 178ms/step - loss: 0.0901 - accuracy: 0.9729 - val_loss: 0.6628 - val_accuracy: 0.8283 Epoch 34/150 283/283 [==============================] - 51s 179ms/step - loss: 0.2124 - accuracy: 0.9329 - val_loss: 0.2955 - val_accuracy: 0.9243 Epoch 35/150 283/283 [==============================] - 51s 178ms/step - loss: 0.0657 - accuracy: 0.9795 - val_loss: 0.2813 - val_accuracy: 0.9303 Epoch 36/150 283/283 [==============================] - 53s 186ms/step - loss: 0.0378 - accuracy: 0.9879 - val_loss: 0.2979 - val_accuracy: 0.9367 Epoch 37/150 283/283 [==============================] - 54s 189ms/step - loss: 0.0785 - accuracy: 0.9747 - val_loss: 0.3274 - val_accuracy: 0.9300 Epoch 38/150 283/283 [==============================] - 54s 189ms/step - loss: 0.1107 - accuracy: 0.9648 - val_loss: 0.3199 - val_accuracy: 0.9267 Epoch 39/150 283/283 [==============================] - 53s 186ms/step - loss: 0.0665 - accuracy: 0.9782 - val_loss: 0.2493 - val_accuracy: 0.9453 Epoch 40/150 283/283 [==============================] - 53s 184ms/step - loss: 0.0833 - accuracy: 0.9746 - val_loss: 0.2715 - val_accuracy: 0.9353 Epoch 41/150 283/283 [==============================] - 53s 187ms/step - loss: 0.0517 - accuracy: 0.9850 - val_loss: 0.3346 - val_accuracy: 0.9250 Epoch 42/150 283/283 [==============================] - 53s 186ms/step - loss: 0.0445 - accuracy: 0.9868 - val_loss: 0.2910 - val_accuracy: 0.9343 Epoch 43/150 283/283 [==============================] - 57s 201ms/step - loss: 0.0293 - accuracy: 0.9924 - val_loss: 0.2413 - val_accuracy: 0.9423 Epoch 44/150 283/283 [==============================] - 56s 197ms/step - loss: 0.0522 - accuracy: 0.9849 - val_loss: 0.2749 - val_accuracy: 0.9417 Epoch 45/150 283/283 [==============================] - 56s 197ms/step - loss: 0.0806 - accuracy: 0.9771 - val_loss: 0.2593 - val_accuracy: 0.9403 Epoch 46/150 283/283 [==============================] - 54s 188ms/step - loss: 0.0274 - accuracy: 0.9928 - val_loss: 0.3371 - val_accuracy: 0.9313 Epoch 47/150 283/283 [==============================] - 51s 180ms/step - loss: 0.1179 - accuracy: 0.9638 - val_loss: 0.3566 - val_accuracy: 0.9160 Epoch 48/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0557 - accuracy: 0.9824 - val_loss: 0.2881 - val_accuracy: 0.9343 Epoch 49/150 283/283 [==============================] - 52s 184ms/step - loss: 0.0412 - accuracy: 0.9880 - val_loss: 0.4669 - val_accuracy: 0.8910 Epoch 50/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0850 - accuracy: 0.9764 - val_loss: 0.2608 - val_accuracy: 0.9467 Epoch 51/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0333 - accuracy: 0.9917 - val_loss: 0.2857 - val_accuracy: 0.9403 Epoch 52/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0757 - accuracy: 0.9764 - val_loss: 0.3082 - val_accuracy: 0.9343 Epoch 53/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0345 - accuracy: 0.9895 - val_loss: 0.3128 - val_accuracy: 0.9263 Epoch 54/150 283/283 [==============================] - 52s 181ms/step - loss: 0.1106 - accuracy: 0.9640 - val_loss: 0.2644 - val_accuracy: 0.9327 Epoch 55/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0569 - accuracy: 0.9811 - val_loss: 0.2570 - val_accuracy: 0.9437 Epoch 56/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0401 - accuracy: 0.9869 - val_loss: 0.2285 - val_accuracy: 0.9510 Epoch 57/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0279 - accuracy: 0.9925 - val_loss: 0.2883 - val_accuracy: 0.9360 Epoch 58/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0413 - accuracy: 0.9875 - val_loss: 0.3526 - val_accuracy: 0.9207 Epoch 59/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0329 - accuracy: 0.9893 - val_loss: 0.2770 - val_accuracy: 0.9453 Epoch 60/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0359 - accuracy: 0.9890 - val_loss: 0.2392 - val_accuracy: 0.9500 Epoch 61/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0287 - accuracy: 0.9922 - val_loss: 0.3243 - val_accuracy: 0.9373 Epoch 62/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0878 - accuracy: 0.9749 - val_loss: 0.2909 - val_accuracy: 0.9343 Epoch 63/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0237 - accuracy: 0.9937 - val_loss: 0.2827 - val_accuracy: 0.9367 Epoch 64/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0254 - accuracy: 0.9931 - val_loss: 0.4546 - val_accuracy: 0.9000 Epoch 65/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0225 - accuracy: 0.9938 - val_loss: 0.1997 - val_accuracy: 0.9570 Epoch 66/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0136 - accuracy: 0.9970 - val_loss: 0.2603 - val_accuracy: 0.9443 Epoch 67/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0801 - accuracy: 0.9733 - val_loss: 0.2942 - val_accuracy: 0.9387 Epoch 68/150 283/283 [==============================] - 53s 186ms/step - loss: 0.0368 - accuracy: 0.9899 - val_loss: 0.2751 - val_accuracy: 0.9407 Epoch 69/150 283/283 [==============================] - 55s 191ms/step - loss: 0.0376 - accuracy: 0.9890 - val_loss: 0.2301 - val_accuracy: 0.9540 Epoch 70/150 283/283 [==============================] - 54s 189ms/step - loss: 0.0406 - accuracy: 0.9900 - val_loss: 0.2795 - val_accuracy: 0.9417 Epoch 71/150 283/283 [==============================] - 54s 190ms/step - loss: 0.0819 - accuracy: 0.9760 - val_loss: 0.2494 - val_accuracy: 0.9503 Epoch 72/150 283/283 [==============================] - 54s 191ms/step - loss: 0.0272 - accuracy: 0.9920 - val_loss: 0.2267 - val_accuracy: 0.9597 Epoch 73/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0185 - accuracy: 0.9950 - val_loss: 0.2734 - val_accuracy: 0.9473 Epoch 74/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0193 - accuracy: 0.9944 - val_loss: 0.2074 - val_accuracy: 0.9590 Epoch 75/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0135 - accuracy: 0.9969 - val_loss: 0.2891 - val_accuracy: 0.9347 Epoch 76/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0550 - accuracy: 0.9836 - val_loss: 0.2542 - val_accuracy: 0.9453 Epoch 77/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0477 - accuracy: 0.9869 - val_loss: 0.2344 - val_accuracy: 0.9503 Epoch 78/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0327 - accuracy: 0.9908 - val_loss: 0.1674 - val_accuracy: 0.9627 Epoch 79/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0196 - accuracy: 0.9955 - val_loss: 0.3024 - val_accuracy: 0.9400 Epoch 80/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0290 - accuracy: 0.9907 - val_loss: 0.2208 - val_accuracy: 0.9523 Epoch 81/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0404 - accuracy: 0.9878 - val_loss: 0.2688 - val_accuracy: 0.9437 Epoch 82/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0235 - accuracy: 0.9937 - val_loss: 0.3683 - val_accuracy: 0.9283 Epoch 83/150 283/283 [==============================] - 51s 179ms/step - loss: 0.0115 - accuracy: 0.9975 - val_loss: 0.2035 - val_accuracy: 0.9587 Epoch 84/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0322 - accuracy: 0.9885 - val_loss: 0.2437 - val_accuracy: 0.9513 Epoch 85/150 283/283 [==============================] - 52s 184ms/step - loss: 0.0402 - accuracy: 0.9894 - val_loss: 0.2252 - val_accuracy: 0.9573 Epoch 86/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0201 - accuracy: 0.9939 - val_loss: 0.2198 - val_accuracy: 0.9537 Epoch 87/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0130 - accuracy: 0.9965 - val_loss: 0.1982 - val_accuracy: 0.9583 Epoch 88/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0147 - accuracy: 0.9957 - val_loss: 0.2318 - val_accuracy: 0.9530 Epoch 89/150 283/283 [==============================] - 51s 180ms/step - loss: 0.0199 - accuracy: 0.9945 - val_loss: 0.2571 - val_accuracy: 0.9527 Epoch 90/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0192 - accuracy: 0.9949 - val_loss: 0.3401 - val_accuracy: 0.9293 Epoch 91/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0615 - accuracy: 0.9822 - val_loss: 0.4606 - val_accuracy: 0.8917 Epoch 92/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0204 - accuracy: 0.9944 - val_loss: 0.2258 - val_accuracy: 0.9557 Epoch 93/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0252 - accuracy: 0.9921 - val_loss: 0.2199 - val_accuracy: 0.9563 Epoch 94/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0094 - accuracy: 0.9979 - val_loss: 0.2388 - val_accuracy: 0.9503 Epoch 95/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0818 - accuracy: 0.9801 - val_loss: 0.2465 - val_accuracy: 0.9477 Epoch 96/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0154 - accuracy: 0.9963 - val_loss: 0.2886 - val_accuracy: 0.9450 Epoch 97/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0262 - accuracy: 0.9926 - val_loss: 0.2160 - val_accuracy: 0.9563 Epoch 98/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0507 - accuracy: 0.9864 - val_loss: 0.2800 - val_accuracy: 0.9437 Epoch 99/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0488 - accuracy: 0.9840 - val_loss: 0.2641 - val_accuracy: 0.9440 Epoch 100/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0169 - accuracy: 0.9960 - val_loss: 0.2034 - val_accuracy: 0.9600 Epoch 101/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0090 - accuracy: 0.9986 - val_loss: 0.1896 - val_accuracy: 0.9607 Epoch 102/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0369 - accuracy: 0.9895 - val_loss: 0.2093 - val_accuracy: 0.9570 Epoch 103/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0190 - accuracy: 0.9946 - val_loss: 0.2460 - val_accuracy: 0.9493 Epoch 104/150 283/283 [==============================] - 52s 182ms/step - loss: 0.0199 - accuracy: 0.9947 - val_loss: 0.2078 - val_accuracy: 0.9623 Epoch 105/150 283/283 [==============================] - 52s 183ms/step - loss: 0.0094 - accuracy: 0.9978 - val_loss: 0.2042 - val_accuracy: 0.9643 Epoch 106/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0307 - accuracy: 0.9910 - val_loss: 0.2227 - val_accuracy: 0.9507 Epoch 107/150 283/283 [==============================] - 55s 192ms/step - loss: 0.0312 - accuracy: 0.9894 - val_loss: 0.2051 - val_accuracy: 0.9603 Epoch 108/150 283/283 [==============================] - 52s 181ms/step - loss: 0.0101 - accuracy: 0.9978 - val_loss: 0.2479 - val_accuracy: 0.9527
history = res128_final_aug.fit(aug_res128_train, validation_data = std_res128_validation, epochs = 150, callbacks = earlyStopper)
cnnf.model_metric_graph(history, res128_final_aug, std_res31_validation, name = "adapted VGG-16 (with augmented images)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 58s 198ms/step - loss: 2.4222 - accuracy: 0.2112 - val_loss: 3.4832 - val_accuracy: 0.0667 Epoch 2/150 283/283 [==============================] - 56s 197ms/step - loss: 1.8777 - accuracy: 0.3677 - val_loss: 2.4124 - val_accuracy: 0.2070 Epoch 3/150 283/283 [==============================] - 56s 196ms/step - loss: 1.6010 - accuracy: 0.4747 - val_loss: 2.6549 - val_accuracy: 0.2847 Epoch 4/150 283/283 [==============================] - 56s 197ms/step - loss: 1.4333 - accuracy: 0.5239 - val_loss: 2.4287 - val_accuracy: 0.3647 Epoch 5/150 283/283 [==============================] - 56s 197ms/step - loss: 1.3196 - accuracy: 0.5670 - val_loss: 1.3495 - val_accuracy: 0.5473 Epoch 6/150 283/283 [==============================] - 56s 197ms/step - loss: 1.1703 - accuracy: 0.6141 - val_loss: 1.7751 - val_accuracy: 0.4633 Epoch 7/150 283/283 [==============================] - 56s 197ms/step - loss: 1.1085 - accuracy: 0.6400 - val_loss: 1.8938 - val_accuracy: 0.4240 Epoch 8/150 283/283 [==============================] - 56s 198ms/step - loss: 0.9717 - accuracy: 0.6843 - val_loss: 2.3660 - val_accuracy: 0.4623 Epoch 9/150 283/283 [==============================] - 56s 196ms/step - loss: 0.9012 - accuracy: 0.7141 - val_loss: 1.2971 - val_accuracy: 0.6227 Epoch 10/150 283/283 [==============================] - 56s 197ms/step - loss: 0.8414 - accuracy: 0.7292 - val_loss: 2.4885 - val_accuracy: 0.4290 Epoch 11/150 283/283 [==============================] - 56s 198ms/step - loss: 0.7244 - accuracy: 0.7683 - val_loss: 1.5485 - val_accuracy: 0.5453 Epoch 12/150 283/283 [==============================] - 56s 198ms/step - loss: 0.6947 - accuracy: 0.7780 - val_loss: 1.0486 - val_accuracy: 0.6657 Epoch 13/150 283/283 [==============================] - 56s 197ms/step - loss: 0.6312 - accuracy: 0.8023 - val_loss: 0.8162 - val_accuracy: 0.7563 Epoch 14/150 283/283 [==============================] - 56s 198ms/step - loss: 0.5585 - accuracy: 0.8208 - val_loss: 0.7568 - val_accuracy: 0.7670 Epoch 15/150 283/283 [==============================] - 56s 197ms/step - loss: 0.5660 - accuracy: 0.8160 - val_loss: 0.7518 - val_accuracy: 0.7610 Epoch 16/150 283/283 [==============================] - 56s 197ms/step - loss: 0.5067 - accuracy: 0.8406 - val_loss: 1.0937 - val_accuracy: 0.6810 Epoch 17/150 283/283 [==============================] - 59s 207ms/step - loss: 0.4534 - accuracy: 0.8500 - val_loss: 1.2870 - val_accuracy: 0.6883 Epoch 18/150 283/283 [==============================] - 57s 198ms/step - loss: 0.4493 - accuracy: 0.8531 - val_loss: 0.5108 - val_accuracy: 0.8390 Epoch 19/150 283/283 [==============================] - 56s 197ms/step - loss: 0.3931 - accuracy: 0.8763 - val_loss: 0.6002 - val_accuracy: 0.8253 Epoch 20/150 283/283 [==============================] - 56s 197ms/step - loss: 0.3460 - accuracy: 0.8889 - val_loss: 0.9029 - val_accuracy: 0.7457 Epoch 21/150 283/283 [==============================] - 56s 197ms/step - loss: 0.3702 - accuracy: 0.8868 - val_loss: 0.4910 - val_accuracy: 0.8530 Epoch 22/150 283/283 [==============================] - 56s 198ms/step - loss: 0.3264 - accuracy: 0.9012 - val_loss: 0.6350 - val_accuracy: 0.8187 Epoch 23/150 283/283 [==============================] - 56s 197ms/step - loss: 0.3166 - accuracy: 0.8994 - val_loss: 0.4755 - val_accuracy: 0.8570 Epoch 24/150 283/283 [==============================] - 56s 196ms/step - loss: 0.3047 - accuracy: 0.9077 - val_loss: 0.4231 - val_accuracy: 0.8700 Epoch 25/150 283/283 [==============================] - 56s 195ms/step - loss: 0.2992 - accuracy: 0.9084 - val_loss: 0.2325 - val_accuracy: 0.9263 Epoch 26/150 283/283 [==============================] - 56s 195ms/step - loss: 0.2672 - accuracy: 0.9184 - val_loss: 0.9854 - val_accuracy: 0.7580 Epoch 27/150 283/283 [==============================] - 56s 198ms/step - loss: 0.2520 - accuracy: 0.9255 - val_loss: 0.6002 - val_accuracy: 0.8443 Epoch 28/150 283/283 [==============================] - 57s 198ms/step - loss: 0.2324 - accuracy: 0.9284 - val_loss: 0.3202 - val_accuracy: 0.8963 Epoch 29/150 283/283 [==============================] - 56s 197ms/step - loss: 0.2265 - accuracy: 0.9294 - val_loss: 0.6791 - val_accuracy: 0.8303 Epoch 30/150 283/283 [==============================] - 57s 199ms/step - loss: 0.2239 - accuracy: 0.9313 - val_loss: 0.2150 - val_accuracy: 0.9390 Epoch 31/150 283/283 [==============================] - 57s 199ms/step - loss: 0.1979 - accuracy: 0.9356 - val_loss: 0.4534 - val_accuracy: 0.8563 Epoch 32/150 283/283 [==============================] - 57s 198ms/step - loss: 0.1981 - accuracy: 0.9386 - val_loss: 1.0555 - val_accuracy: 0.7063 Epoch 33/150 283/283 [==============================] - 56s 198ms/step - loss: 0.1903 - accuracy: 0.9403 - val_loss: 1.0971 - val_accuracy: 0.7640 Epoch 34/150 283/283 [==============================] - 56s 196ms/step - loss: 0.2054 - accuracy: 0.9385 - val_loss: 0.5920 - val_accuracy: 0.8407 Epoch 35/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1677 - accuracy: 0.9473 - val_loss: 0.7191 - val_accuracy: 0.8080 Epoch 36/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1876 - accuracy: 0.9438 - val_loss: 0.5993 - val_accuracy: 0.8380 Epoch 37/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1551 - accuracy: 0.9521 - val_loss: 0.2037 - val_accuracy: 0.9420 Epoch 38/150 283/283 [==============================] - 56s 198ms/step - loss: 0.1623 - accuracy: 0.9495 - val_loss: 0.3490 - val_accuracy: 0.9077 Epoch 39/150 283/283 [==============================] - 57s 198ms/step - loss: 0.1430 - accuracy: 0.9572 - val_loss: 0.2632 - val_accuracy: 0.9197 Epoch 40/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1412 - accuracy: 0.9557 - val_loss: 0.3002 - val_accuracy: 0.9130 Epoch 41/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1273 - accuracy: 0.9593 - val_loss: 0.5748 - val_accuracy: 0.8693 Epoch 42/150 283/283 [==============================] - 56s 198ms/step - loss: 0.1463 - accuracy: 0.9545 - val_loss: 0.3179 - val_accuracy: 0.9157 Epoch 43/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1389 - accuracy: 0.9543 - val_loss: 0.4218 - val_accuracy: 0.8803 Epoch 44/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1447 - accuracy: 0.9569 - val_loss: 0.1932 - val_accuracy: 0.9493 Epoch 45/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1323 - accuracy: 0.9575 - val_loss: 0.2755 - val_accuracy: 0.9163 Epoch 46/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1267 - accuracy: 0.9608 - val_loss: 0.2934 - val_accuracy: 0.9267 Epoch 47/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1251 - accuracy: 0.9606 - val_loss: 0.2220 - val_accuracy: 0.9467 Epoch 48/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1126 - accuracy: 0.9656 - val_loss: 0.1647 - val_accuracy: 0.9563 Epoch 49/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1418 - accuracy: 0.9553 - val_loss: 0.1860 - val_accuracy: 0.9560 Epoch 50/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1275 - accuracy: 0.9613 - val_loss: 0.5491 - val_accuracy: 0.8623 Epoch 51/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1110 - accuracy: 0.9649 - val_loss: 0.1730 - val_accuracy: 0.9563 Epoch 52/150 283/283 [==============================] - 56s 197ms/step - loss: 0.1188 - accuracy: 0.9620 - val_loss: 0.2401 - val_accuracy: 0.9347 Epoch 53/150 283/283 [==============================] - 56s 198ms/step - loss: 0.0970 - accuracy: 0.9695 - val_loss: 0.3167 - val_accuracy: 0.9210 Epoch 54/150 283/283 [==============================] - 58s 202ms/step - loss: 0.0994 - accuracy: 0.9694 - val_loss: 0.2470 - val_accuracy: 0.9313 Epoch 55/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1151 - accuracy: 0.9663 - val_loss: 0.1210 - val_accuracy: 0.9643 Epoch 56/150 283/283 [==============================] - 56s 196ms/step - loss: 0.1096 - accuracy: 0.9675 - val_loss: 0.1423 - val_accuracy: 0.9593 Epoch 57/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0804 - accuracy: 0.9757 - val_loss: 0.1355 - val_accuracy: 0.9623 Epoch 58/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0864 - accuracy: 0.9754 - val_loss: 0.1665 - val_accuracy: 0.9563 Epoch 59/150 283/283 [==============================] - 55s 195ms/step - loss: 0.1069 - accuracy: 0.9691 - val_loss: 0.1947 - val_accuracy: 0.9507 Epoch 60/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0906 - accuracy: 0.9722 - val_loss: 0.1477 - val_accuracy: 0.9577 Epoch 61/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0885 - accuracy: 0.9737 - val_loss: 0.1305 - val_accuracy: 0.9663 Epoch 62/150 283/283 [==============================] - 56s 195ms/step - loss: 0.1052 - accuracy: 0.9681 - val_loss: 0.0827 - val_accuracy: 0.9730 Epoch 63/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0765 - accuracy: 0.9750 - val_loss: 0.1062 - val_accuracy: 0.9680 Epoch 64/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0900 - accuracy: 0.9721 - val_loss: 0.1793 - val_accuracy: 0.9530 Epoch 65/150 283/283 [==============================] - 55s 195ms/step - loss: 0.0788 - accuracy: 0.9749 - val_loss: 0.0865 - val_accuracy: 0.9810 Epoch 66/150 283/283 [==============================] - 55s 194ms/step - loss: 0.1044 - accuracy: 0.9731 - val_loss: 0.2020 - val_accuracy: 0.9450 Epoch 67/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0992 - accuracy: 0.9688 - val_loss: 0.1543 - val_accuracy: 0.9570 Epoch 68/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0733 - accuracy: 0.9776 - val_loss: 0.1040 - val_accuracy: 0.9713 Epoch 69/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0603 - accuracy: 0.9801 - val_loss: 0.1224 - val_accuracy: 0.9667 Epoch 70/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0677 - accuracy: 0.9802 - val_loss: 0.1683 - val_accuracy: 0.9547 Epoch 71/150 283/283 [==============================] - 55s 195ms/step - loss: 0.0582 - accuracy: 0.9831 - val_loss: 0.2791 - val_accuracy: 0.9267 Epoch 72/150 283/283 [==============================] - 58s 205ms/step - loss: 0.1021 - accuracy: 0.9701 - val_loss: 0.1065 - val_accuracy: 0.9750 Epoch 73/150 283/283 [==============================] - 58s 197ms/step - loss: 0.0459 - accuracy: 0.9862 - val_loss: 0.1241 - val_accuracy: 0.9693 Epoch 74/150 283/283 [==============================] - 55s 193ms/step - loss: 0.0687 - accuracy: 0.9787 - val_loss: 0.1492 - val_accuracy: 0.9550 Epoch 75/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0755 - accuracy: 0.9771 - val_loss: 0.2038 - val_accuracy: 0.9453 Epoch 76/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0794 - accuracy: 0.9772 - val_loss: 0.1972 - val_accuracy: 0.9450 Epoch 77/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0603 - accuracy: 0.9822 - val_loss: 0.1846 - val_accuracy: 0.9500 Epoch 78/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0840 - accuracy: 0.9742 - val_loss: 0.1316 - val_accuracy: 0.9640 Epoch 79/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0560 - accuracy: 0.9844 - val_loss: 0.0781 - val_accuracy: 0.9760 Epoch 80/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0965 - accuracy: 0.9702 - val_loss: 0.1219 - val_accuracy: 0.9660 Epoch 81/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0535 - accuracy: 0.9839 - val_loss: 0.0975 - val_accuracy: 0.9713 Epoch 82/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0619 - accuracy: 0.9804 - val_loss: 0.1325 - val_accuracy: 0.9640 Epoch 83/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0538 - accuracy: 0.9842 - val_loss: 0.0987 - val_accuracy: 0.9757 Epoch 84/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0593 - accuracy: 0.9805 - val_loss: 0.1192 - val_accuracy: 0.9660 Epoch 85/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0703 - accuracy: 0.9785 - val_loss: 0.0905 - val_accuracy: 0.9757 Epoch 86/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0456 - accuracy: 0.9865 - val_loss: 0.1438 - val_accuracy: 0.9647 Epoch 87/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0556 - accuracy: 0.9817 - val_loss: 0.0963 - val_accuracy: 0.9737 Epoch 88/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0631 - accuracy: 0.9812 - val_loss: 0.0567 - val_accuracy: 0.9850 Epoch 89/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0458 - accuracy: 0.9864 - val_loss: 0.1353 - val_accuracy: 0.9663 Epoch 90/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0750 - accuracy: 0.9774 - val_loss: 0.1365 - val_accuracy: 0.9643 Epoch 91/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0503 - accuracy: 0.9854 - val_loss: 0.1098 - val_accuracy: 0.9673 Epoch 92/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0485 - accuracy: 0.9853 - val_loss: 0.1387 - val_accuracy: 0.9643 Epoch 93/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0530 - accuracy: 0.9837 - val_loss: 0.1271 - val_accuracy: 0.9673 Epoch 94/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0578 - accuracy: 0.9824 - val_loss: 0.0674 - val_accuracy: 0.9813 Epoch 95/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0499 - accuracy: 0.9850 - val_loss: 0.1024 - val_accuracy: 0.9700 Epoch 96/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0637 - accuracy: 0.9798 - val_loss: 0.1325 - val_accuracy: 0.9650 Epoch 97/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0748 - accuracy: 0.9774 - val_loss: 0.0736 - val_accuracy: 0.9793 Epoch 98/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0603 - accuracy: 0.9818 - val_loss: 0.1096 - val_accuracy: 0.9733 Epoch 99/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0531 - accuracy: 0.9843 - val_loss: 0.0947 - val_accuracy: 0.9753 Epoch 100/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0482 - accuracy: 0.9854 - val_loss: 0.1098 - val_accuracy: 0.9700 Epoch 101/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0467 - accuracy: 0.9862 - val_loss: 0.0929 - val_accuracy: 0.9757 Epoch 102/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0395 - accuracy: 0.9879 - val_loss: 0.1079 - val_accuracy: 0.9717 Epoch 103/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0597 - accuracy: 0.9821 - val_loss: 0.0583 - val_accuracy: 0.9827 Epoch 104/150 283/283 [==============================] - 55s 195ms/step - loss: 0.0368 - accuracy: 0.9884 - val_loss: 0.1852 - val_accuracy: 0.9550 Epoch 105/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0416 - accuracy: 0.9885 - val_loss: 0.1173 - val_accuracy: 0.9707 Epoch 106/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0538 - accuracy: 0.9840 - val_loss: 0.1070 - val_accuracy: 0.9730 Epoch 107/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0609 - accuracy: 0.9824 - val_loss: 0.0792 - val_accuracy: 0.9777 Epoch 108/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0491 - accuracy: 0.9865 - val_loss: 0.0543 - val_accuracy: 0.9870 Epoch 109/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0513 - accuracy: 0.9848 - val_loss: 0.0525 - val_accuracy: 0.9873 Epoch 110/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0371 - accuracy: 0.9890 - val_loss: 0.0707 - val_accuracy: 0.9817 Epoch 111/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0293 - accuracy: 0.9930 - val_loss: 0.2009 - val_accuracy: 0.9417 Epoch 112/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0493 - accuracy: 0.9850 - val_loss: 0.0778 - val_accuracy: 0.9757 Epoch 113/150 283/283 [==============================] - 56s 197ms/step - loss: 0.0407 - accuracy: 0.9890 - val_loss: 0.1273 - val_accuracy: 0.9643 Epoch 114/150 283/283 [==============================] - 58s 202ms/step - loss: 0.0564 - accuracy: 0.9824 - val_loss: 0.0723 - val_accuracy: 0.9810 Epoch 115/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0458 - accuracy: 0.9872 - val_loss: 0.0668 - val_accuracy: 0.9843 Epoch 116/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0603 - accuracy: 0.9818 - val_loss: 0.0892 - val_accuracy: 0.9760 Epoch 117/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0382 - accuracy: 0.9887 - val_loss: 0.0747 - val_accuracy: 0.9800 Epoch 118/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0428 - accuracy: 0.9870 - val_loss: 0.2627 - val_accuracy: 0.9410 Epoch 119/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0285 - accuracy: 0.9911 - val_loss: 0.4958 - val_accuracy: 0.9027 Epoch 120/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0223 - accuracy: 0.9929 - val_loss: 0.0963 - val_accuracy: 0.9790 Epoch 121/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0536 - accuracy: 0.9840 - val_loss: 0.1485 - val_accuracy: 0.9667 Epoch 122/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0504 - accuracy: 0.9870 - val_loss: 0.0860 - val_accuracy: 0.9750 Epoch 123/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0427 - accuracy: 0.9870 - val_loss: 0.0987 - val_accuracy: 0.9753 Epoch 124/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0321 - accuracy: 0.9912 - val_loss: 0.0631 - val_accuracy: 0.9860 Epoch 125/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0337 - accuracy: 0.9909 - val_loss: 0.1287 - val_accuracy: 0.9707 Epoch 126/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0437 - accuracy: 0.9862 - val_loss: 0.1158 - val_accuracy: 0.9747 Epoch 127/150 283/283 [==============================] - 55s 195ms/step - loss: 0.0383 - accuracy: 0.9896 - val_loss: 0.0723 - val_accuracy: 0.9833 Epoch 128/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0324 - accuracy: 0.9903 - val_loss: 0.0689 - val_accuracy: 0.9803 Epoch 129/150 283/283 [==============================] - 55s 194ms/step - loss: 0.0440 - accuracy: 0.9880 - val_loss: 0.0967 - val_accuracy: 0.9777 Epoch 130/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0544 - accuracy: 0.9860 - val_loss: 0.0804 - val_accuracy: 0.9763 Epoch 131/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0477 - accuracy: 0.9863 - val_loss: 0.1692 - val_accuracy: 0.9580 Epoch 132/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0379 - accuracy: 0.9896 - val_loss: 0.0773 - val_accuracy: 0.9833 Epoch 133/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0268 - accuracy: 0.9927 - val_loss: 0.0545 - val_accuracy: 0.9860 Epoch 134/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0476 - accuracy: 0.9878 - val_loss: 0.1215 - val_accuracy: 0.9697 Epoch 135/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0382 - accuracy: 0.9897 - val_loss: 0.0802 - val_accuracy: 0.9843 Epoch 136/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0477 - accuracy: 0.9853 - val_loss: 0.0616 - val_accuracy: 0.9843 Epoch 137/150 283/283 [==============================] - 56s 195ms/step - loss: 0.0408 - accuracy: 0.9877 - val_loss: 0.0611 - val_accuracy: 0.9863 Epoch 138/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0199 - accuracy: 0.9952 - val_loss: 0.1080 - val_accuracy: 0.9780 Epoch 139/150 283/283 [==============================] - 56s 196ms/step - loss: 0.0430 - accuracy: 0.9887 - val_loss: 0.0823 - val_accuracy: 0.9790
history = res128_final_cw_aug.fit(aug_res128_train, validation_data = std_res128_validation, epochs = 150, callbacks = earlyStopper, class_weight = class_weights)
cnnf.model_metric_graph(history, res128_final_cw_aug, std_res31_validation, name = "adapted VGG-16 (with class weights and augmented images applied)", class_names = class_names)
Epoch 1/150 283/283 [==============================] - 52s 176ms/step - loss: 2.5004 - accuracy: 0.1635 - val_loss: 3.5651 - val_accuracy: 0.0773 Epoch 2/150 283/283 [==============================] - 49s 173ms/step - loss: 2.0175 - accuracy: 0.3209 - val_loss: 2.1078 - val_accuracy: 0.2823 Epoch 3/150 283/283 [==============================] - 49s 172ms/step - loss: 1.8039 - accuracy: 0.3949 - val_loss: 3.6867 - val_accuracy: 0.2447 Epoch 4/150 283/283 [==============================] - 49s 171ms/step - loss: 1.7024 - accuracy: 0.4342 - val_loss: 3.4413 - val_accuracy: 0.2903 Epoch 5/150 283/283 [==============================] - 49s 172ms/step - loss: 1.5026 - accuracy: 0.5006 - val_loss: 1.5824 - val_accuracy: 0.4677 Epoch 6/150 283/283 [==============================] - 49s 171ms/step - loss: 1.3714 - accuracy: 0.5439 - val_loss: 2.2896 - val_accuracy: 0.3827 Epoch 7/150 283/283 [==============================] - 49s 172ms/step - loss: 1.2926 - accuracy: 0.5840 - val_loss: 1.4818 - val_accuracy: 0.5037 Epoch 8/150 283/283 [==============================] - 49s 172ms/step - loss: 1.1617 - accuracy: 0.6238 - val_loss: 1.3014 - val_accuracy: 0.6030 Epoch 9/150 283/283 [==============================] - 49s 171ms/step - loss: 1.0591 - accuracy: 0.6540 - val_loss: 1.5194 - val_accuracy: 0.5337 Epoch 10/150 283/283 [==============================] - 49s 170ms/step - loss: 0.9973 - accuracy: 0.6708 - val_loss: 1.7059 - val_accuracy: 0.5457 Epoch 11/150 283/283 [==============================] - 49s 172ms/step - loss: 0.9342 - accuracy: 0.6938 - val_loss: 1.2007 - val_accuracy: 0.6120 Epoch 12/150 283/283 [==============================] - 49s 171ms/step - loss: 0.8907 - accuracy: 0.7111 - val_loss: 1.3605 - val_accuracy: 0.5957 Epoch 13/150 283/283 [==============================] - 49s 173ms/step - loss: 0.8066 - accuracy: 0.7337 - val_loss: 1.6680 - val_accuracy: 0.6007 Epoch 14/150 283/283 [==============================] - 49s 173ms/step - loss: 0.7692 - accuracy: 0.7493 - val_loss: 1.1905 - val_accuracy: 0.6353 Epoch 15/150 283/283 [==============================] - 50s 174ms/step - loss: 0.6997 - accuracy: 0.7696 - val_loss: 1.1700 - val_accuracy: 0.6907 Epoch 16/150 283/283 [==============================] - 50s 174ms/step - loss: 0.6801 - accuracy: 0.7806 - val_loss: 0.9963 - val_accuracy: 0.6943 Epoch 17/150 283/283 [==============================] - 49s 173ms/step - loss: 0.6016 - accuracy: 0.8035 - val_loss: 1.3738 - val_accuracy: 0.6400 Epoch 18/150 283/283 [==============================] - 49s 172ms/step - loss: 0.6011 - accuracy: 0.8027 - val_loss: 0.9628 - val_accuracy: 0.7353 Epoch 19/150 283/283 [==============================] - 49s 171ms/step - loss: 0.5178 - accuracy: 0.8365 - val_loss: 0.9411 - val_accuracy: 0.7120 Epoch 20/150 283/283 [==============================] - 50s 174ms/step - loss: 0.5034 - accuracy: 0.8383 - val_loss: 0.5420 - val_accuracy: 0.8273 Epoch 21/150 283/283 [==============================] - 49s 171ms/step - loss: 0.4621 - accuracy: 0.8562 - val_loss: 0.6760 - val_accuracy: 0.8007 Epoch 22/150 283/283 [==============================] - 48s 169ms/step - loss: 0.4097 - accuracy: 0.8746 - val_loss: 0.6002 - val_accuracy: 0.8207 Epoch 23/150 283/283 [==============================] - 49s 173ms/step - loss: 0.3680 - accuracy: 0.8818 - val_loss: 0.5455 - val_accuracy: 0.8387 Epoch 24/150 283/283 [==============================] - 49s 173ms/step - loss: 0.3573 - accuracy: 0.8867 - val_loss: 0.3993 - val_accuracy: 0.8833 Epoch 25/150 283/283 [==============================] - 50s 175ms/step - loss: 0.3461 - accuracy: 0.8907 - val_loss: 0.3311 - val_accuracy: 0.8970 Epoch 26/150 283/283 [==============================] - 49s 172ms/step - loss: 0.3201 - accuracy: 0.8958 - val_loss: 0.3173 - val_accuracy: 0.9050 Epoch 27/150 283/283 [==============================] - 48s 170ms/step - loss: 0.2840 - accuracy: 0.9071 - val_loss: 0.2990 - val_accuracy: 0.9093 Epoch 28/150 283/283 [==============================] - 49s 172ms/step - loss: 0.2946 - accuracy: 0.9078 - val_loss: 0.9116 - val_accuracy: 0.7397 Epoch 29/150 283/283 [==============================] - 49s 171ms/step - loss: 0.3083 - accuracy: 0.9021 - val_loss: 0.3049 - val_accuracy: 0.9083 Epoch 30/150 283/283 [==============================] - 49s 171ms/step - loss: 0.2673 - accuracy: 0.9157 - val_loss: 0.2936 - val_accuracy: 0.9123 Epoch 31/150 283/283 [==============================] - 48s 170ms/step - loss: 0.2574 - accuracy: 0.9171 - val_loss: 0.3694 - val_accuracy: 0.8930 Epoch 32/150 283/283 [==============================] - 49s 172ms/step - loss: 0.2057 - accuracy: 0.9329 - val_loss: 0.4277 - val_accuracy: 0.8797 Epoch 33/150 283/283 [==============================] - 49s 173ms/step - loss: 0.2190 - accuracy: 0.9302 - val_loss: 0.3843 - val_accuracy: 0.8887 Epoch 34/150 283/283 [==============================] - 49s 171ms/step - loss: 0.2292 - accuracy: 0.9269 - val_loss: 0.5753 - val_accuracy: 0.8563 Epoch 35/150 283/283 [==============================] - 49s 171ms/step - loss: 0.2119 - accuracy: 0.9337 - val_loss: 0.2298 - val_accuracy: 0.9367 Epoch 36/150 283/283 [==============================] - 49s 172ms/step - loss: 0.2260 - accuracy: 0.9307 - val_loss: 0.2862 - val_accuracy: 0.9147 Epoch 37/150 283/283 [==============================] - 48s 169ms/step - loss: 0.1793 - accuracy: 0.9405 - val_loss: 0.2175 - val_accuracy: 0.9370 Epoch 38/150 283/283 [==============================] - 49s 173ms/step - loss: 0.1777 - accuracy: 0.9451 - val_loss: 1.2564 - val_accuracy: 0.7143 Epoch 39/150 283/283 [==============================] - 48s 169ms/step - loss: 0.1907 - accuracy: 0.9413 - val_loss: 0.2530 - val_accuracy: 0.9307 Epoch 40/150 283/283 [==============================] - 49s 171ms/step - loss: 0.1757 - accuracy: 0.9448 - val_loss: 0.4177 - val_accuracy: 0.8863 Epoch 41/150 283/283 [==============================] - 49s 172ms/step - loss: 0.1819 - accuracy: 0.9414 - val_loss: 0.5381 - val_accuracy: 0.8623 Epoch 42/150 283/283 [==============================] - 49s 170ms/step - loss: 0.2285 - accuracy: 0.9258 - val_loss: 0.3583 - val_accuracy: 0.8967 Epoch 43/150 283/283 [==============================] - 48s 170ms/step - loss: 0.1477 - accuracy: 0.9513 - val_loss: 0.5804 - val_accuracy: 0.8577 Epoch 44/150 283/283 [==============================] - 49s 173ms/step - loss: 0.1469 - accuracy: 0.9525 - val_loss: 0.2819 - val_accuracy: 0.9317 Epoch 45/150 283/283 [==============================] - 49s 170ms/step - loss: 0.1519 - accuracy: 0.9508 - val_loss: 0.3036 - val_accuracy: 0.9123 Epoch 46/150 283/283 [==============================] - 50s 173ms/step - loss: 0.1352 - accuracy: 0.9541 - val_loss: 0.2183 - val_accuracy: 0.9433 Epoch 47/150 283/283 [==============================] - 50s 174ms/step - loss: 0.1521 - accuracy: 0.9545 - val_loss: 0.3952 - val_accuracy: 0.8960 Epoch 48/150 283/283 [==============================] - 49s 172ms/step - loss: 0.1192 - accuracy: 0.9620 - val_loss: 0.1891 - val_accuracy: 0.9483 Epoch 49/150 283/283 [==============================] - 48s 170ms/step - loss: 0.1454 - accuracy: 0.9530 - val_loss: 0.1675 - val_accuracy: 0.9453 Epoch 50/150 283/283 [==============================] - 49s 173ms/step - loss: 0.1351 - accuracy: 0.9603 - val_loss: 0.5026 - val_accuracy: 0.8723 Epoch 51/150 283/283 [==============================] - 49s 171ms/step - loss: 0.1417 - accuracy: 0.9530 - val_loss: 0.9514 - val_accuracy: 0.7700 Epoch 52/150 283/283 [==============================] - 49s 171ms/step - loss: 0.2366 - accuracy: 0.9260 - val_loss: 0.2037 - val_accuracy: 0.9413 Epoch 53/150 283/283 [==============================] - 49s 173ms/step - loss: 0.1590 - accuracy: 0.9488 - val_loss: 0.1885 - val_accuracy: 0.9443 Epoch 54/150 283/283 [==============================] - 49s 172ms/step - loss: 0.1331 - accuracy: 0.9591 - val_loss: 0.1361 - val_accuracy: 0.9580 Epoch 55/150 283/283 [==============================] - 49s 171ms/step - loss: 0.1074 - accuracy: 0.9650 - val_loss: 0.3001 - val_accuracy: 0.9180 Epoch 56/150 283/283 [==============================] - 49s 172ms/step - loss: 0.1461 - accuracy: 0.9568 - val_loss: 0.1865 - val_accuracy: 0.9497 Epoch 57/150 283/283 [==============================] - 49s 173ms/step - loss: 0.1090 - accuracy: 0.9640 - val_loss: 0.1908 - val_accuracy: 0.9523 Epoch 58/150 283/283 [==============================] - 49s 173ms/step - loss: 0.1238 - accuracy: 0.9639 - val_loss: 0.1664 - val_accuracy: 0.9543 Epoch 59/150 283/283 [==============================] - 49s 173ms/step - loss: 0.0975 - accuracy: 0.9671 - val_loss: 0.4894 - val_accuracy: 0.8770 Epoch 60/150 283/283 [==============================] - 50s 173ms/step - loss: 0.1392 - accuracy: 0.9543 - val_loss: 0.2647 - val_accuracy: 0.9270 Epoch 61/150 283/283 [==============================] - 49s 172ms/step - loss: 0.0937 - accuracy: 0.9694 - val_loss: 0.1482 - val_accuracy: 0.9590 Epoch 62/150 283/283 [==============================] - 49s 171ms/step - loss: 0.1195 - accuracy: 0.9651 - val_loss: 0.3310 - val_accuracy: 0.9110 Epoch 63/150 283/283 [==============================] - 49s 173ms/step - loss: 0.0957 - accuracy: 0.9722 - val_loss: 0.1399 - val_accuracy: 0.9640 Epoch 64/150 283/283 [==============================] - 49s 172ms/step - loss: 0.0808 - accuracy: 0.9724 - val_loss: 0.2927 - val_accuracy: 0.9243 Epoch 65/150 283/283 [==============================] - 49s 172ms/step - loss: 0.0876 - accuracy: 0.9721 - val_loss: 0.1358 - val_accuracy: 0.9650 Epoch 66/150 283/283 [==============================] - 49s 172ms/step - loss: 0.0887 - accuracy: 0.9723 - val_loss: 0.6669 - val_accuracy: 0.8390 Epoch 67/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0887 - accuracy: 0.9719 - val_loss: 0.1787 - val_accuracy: 0.9533 Epoch 68/150 283/283 [==============================] - 50s 173ms/step - loss: 0.1057 - accuracy: 0.9677 - val_loss: 0.1420 - val_accuracy: 0.9587 Epoch 69/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0741 - accuracy: 0.9736 - val_loss: 0.1155 - val_accuracy: 0.9723 Epoch 70/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0923 - accuracy: 0.9704 - val_loss: 0.5099 - val_accuracy: 0.9107 Epoch 71/150 283/283 [==============================] - 50s 174ms/step - loss: 0.0895 - accuracy: 0.9722 - val_loss: 0.5233 - val_accuracy: 0.8720 Epoch 72/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0836 - accuracy: 0.9721 - val_loss: 0.5088 - val_accuracy: 0.8873 Epoch 73/150 283/283 [==============================] - 48s 167ms/step - loss: 0.1560 - accuracy: 0.9520 - val_loss: 0.1200 - val_accuracy: 0.9683 Epoch 74/150 283/283 [==============================] - 48s 167ms/step - loss: 0.0851 - accuracy: 0.9716 - val_loss: 0.2701 - val_accuracy: 0.9297 Epoch 75/150 283/283 [==============================] - 50s 175ms/step - loss: 0.0894 - accuracy: 0.9699 - val_loss: 0.2315 - val_accuracy: 0.9337 Epoch 76/150 283/283 [==============================] - 50s 174ms/step - loss: 0.0675 - accuracy: 0.9782 - val_loss: 0.0829 - val_accuracy: 0.9793 Epoch 77/150 283/283 [==============================] - 49s 173ms/step - loss: 0.0490 - accuracy: 0.9828 - val_loss: 0.1383 - val_accuracy: 0.9643 Epoch 78/150 283/283 [==============================] - 50s 174ms/step - loss: 0.0707 - accuracy: 0.9773 - val_loss: 0.1208 - val_accuracy: 0.9690 Epoch 79/150 283/283 [==============================] - 50s 174ms/step - loss: 0.0727 - accuracy: 0.9773 - val_loss: 0.1951 - val_accuracy: 0.9523 Epoch 80/150 283/283 [==============================] - 50s 175ms/step - loss: 0.0627 - accuracy: 0.9780 - val_loss: 0.1535 - val_accuracy: 0.9633 Epoch 81/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0950 - accuracy: 0.9704 - val_loss: 0.1080 - val_accuracy: 0.9727 Epoch 82/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0913 - accuracy: 0.9700 - val_loss: 0.2461 - val_accuracy: 0.9393 Epoch 83/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0634 - accuracy: 0.9791 - val_loss: 0.0867 - val_accuracy: 0.9787 Epoch 84/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0825 - accuracy: 0.9749 - val_loss: 0.1481 - val_accuracy: 0.9593 Epoch 85/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0564 - accuracy: 0.9823 - val_loss: 0.1245 - val_accuracy: 0.9657 Epoch 86/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0611 - accuracy: 0.9808 - val_loss: 0.0954 - val_accuracy: 0.9747 Epoch 87/150 283/283 [==============================] - 48s 169ms/step - loss: 0.1055 - accuracy: 0.9690 - val_loss: 0.1596 - val_accuracy: 0.9557 Epoch 88/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0732 - accuracy: 0.9785 - val_loss: 0.0733 - val_accuracy: 0.9803 Epoch 89/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0508 - accuracy: 0.9831 - val_loss: 0.1036 - val_accuracy: 0.9750 Epoch 90/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0527 - accuracy: 0.9840 - val_loss: 0.1038 - val_accuracy: 0.9697 Epoch 91/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0605 - accuracy: 0.9811 - val_loss: 0.2352 - val_accuracy: 0.9373 Epoch 92/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0784 - accuracy: 0.9750 - val_loss: 0.3550 - val_accuracy: 0.9037 Epoch 93/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0585 - accuracy: 0.9808 - val_loss: 0.1337 - val_accuracy: 0.9663 Epoch 94/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0724 - accuracy: 0.9768 - val_loss: 0.0942 - val_accuracy: 0.9750 Epoch 95/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0485 - accuracy: 0.9856 - val_loss: 0.2701 - val_accuracy: 0.9377 Epoch 96/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0821 - accuracy: 0.9757 - val_loss: 1.3307 - val_accuracy: 0.7247 Epoch 97/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0636 - accuracy: 0.9786 - val_loss: 0.1075 - val_accuracy: 0.9710 Epoch 98/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0579 - accuracy: 0.9821 - val_loss: 0.0937 - val_accuracy: 0.9757 Epoch 99/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0393 - accuracy: 0.9858 - val_loss: 0.0910 - val_accuracy: 0.9787 Epoch 100/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0459 - accuracy: 0.9852 - val_loss: 0.0753 - val_accuracy: 0.9840 Epoch 101/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0820 - accuracy: 0.9755 - val_loss: 0.1075 - val_accuracy: 0.9750 Epoch 102/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0452 - accuracy: 0.9877 - val_loss: 0.0922 - val_accuracy: 0.9763 Epoch 103/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0585 - accuracy: 0.9798 - val_loss: 0.1073 - val_accuracy: 0.9713 Epoch 104/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0591 - accuracy: 0.9811 - val_loss: 0.1308 - val_accuracy: 0.9657 Epoch 105/150 283/283 [==============================] - 48s 167ms/step - loss: 0.0823 - accuracy: 0.9751 - val_loss: 0.1054 - val_accuracy: 0.9763 Epoch 106/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0413 - accuracy: 0.9863 - val_loss: 0.0891 - val_accuracy: 0.9767 Epoch 107/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0480 - accuracy: 0.9844 - val_loss: 0.0964 - val_accuracy: 0.9793 Epoch 108/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0680 - accuracy: 0.9792 - val_loss: 0.1512 - val_accuracy: 0.9617 Epoch 109/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0611 - accuracy: 0.9817 - val_loss: 0.0663 - val_accuracy: 0.9833 Epoch 110/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0416 - accuracy: 0.9864 - val_loss: 0.1113 - val_accuracy: 0.9750 Epoch 111/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0486 - accuracy: 0.9854 - val_loss: 0.1975 - val_accuracy: 0.9517 Epoch 112/150 283/283 [==============================] - 50s 174ms/step - loss: 0.0886 - accuracy: 0.9722 - val_loss: 0.0933 - val_accuracy: 0.9750 Epoch 113/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0423 - accuracy: 0.9867 - val_loss: 0.0714 - val_accuracy: 0.9797 Epoch 114/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0570 - accuracy: 0.9822 - val_loss: 0.0959 - val_accuracy: 0.9750 Epoch 115/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0528 - accuracy: 0.9847 - val_loss: 0.0688 - val_accuracy: 0.9823 Epoch 116/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0354 - accuracy: 0.9887 - val_loss: 0.0891 - val_accuracy: 0.9773 Epoch 117/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0858 - accuracy: 0.9730 - val_loss: 0.1173 - val_accuracy: 0.9683 Epoch 118/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0475 - accuracy: 0.9857 - val_loss: 0.0828 - val_accuracy: 0.9813 Epoch 119/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0361 - accuracy: 0.9870 - val_loss: 0.1099 - val_accuracy: 0.9687 Epoch 120/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0400 - accuracy: 0.9883 - val_loss: 0.1261 - val_accuracy: 0.9663 Epoch 121/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0568 - accuracy: 0.9807 - val_loss: 0.0760 - val_accuracy: 0.9830 Epoch 122/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0506 - accuracy: 0.9846 - val_loss: 0.1194 - val_accuracy: 0.9717 Epoch 123/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0507 - accuracy: 0.9840 - val_loss: 0.0665 - val_accuracy: 0.9840 Epoch 124/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0429 - accuracy: 0.9878 - val_loss: 0.0820 - val_accuracy: 0.9820 Epoch 125/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0210 - accuracy: 0.9925 - val_loss: 0.0724 - val_accuracy: 0.9833 Epoch 126/150 283/283 [==============================] - 48s 168ms/step - loss: 0.0496 - accuracy: 0.9845 - val_loss: 0.0931 - val_accuracy: 0.9753 Epoch 127/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0582 - accuracy: 0.9842 - val_loss: 0.0891 - val_accuracy: 0.9750 Epoch 128/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0486 - accuracy: 0.9855 - val_loss: 0.1856 - val_accuracy: 0.9627 Epoch 129/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0380 - accuracy: 0.9875 - val_loss: 0.1100 - val_accuracy: 0.9723 Epoch 130/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0386 - accuracy: 0.9896 - val_loss: 0.1057 - val_accuracy: 0.9700 Epoch 131/150 283/283 [==============================] - 48s 169ms/step - loss: 0.0577 - accuracy: 0.9829 - val_loss: 0.1242 - val_accuracy: 0.9713 Epoch 132/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0446 - accuracy: 0.9868 - val_loss: 0.0760 - val_accuracy: 0.9820 Epoch 133/150 283/283 [==============================] - 49s 170ms/step - loss: 0.0368 - accuracy: 0.9880 - val_loss: 0.0924 - val_accuracy: 0.9807 Epoch 134/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0438 - accuracy: 0.9852 - val_loss: 0.0673 - val_accuracy: 0.9833 Epoch 135/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0484 - accuracy: 0.9845 - val_loss: 0.0857 - val_accuracy: 0.9803 Epoch 136/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0457 - accuracy: 0.9877 - val_loss: 0.1088 - val_accuracy: 0.9723 Epoch 137/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0333 - accuracy: 0.9897 - val_loss: 0.0834 - val_accuracy: 0.9737 Epoch 138/150 283/283 [==============================] - 49s 171ms/step - loss: 0.0489 - accuracy: 0.9842 - val_loss: 0.1568 - val_accuracy: 0.9573 Epoch 139/150 283/283 [==============================] - 48s 170ms/step - loss: 0.0349 - accuracy: 0.9886 - val_loss: 0.1532 - val_accuracy: 0.9620
improvement128_model_list = [res128_vggmodel, res128_final_cw, res128_final_aug, res128_final_cw_aug]
improvement128_data_list = [std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation]
name_list = ["VGG Model with no improvements", "Class Weights Applied", "Augmentation Applied", "Class Weights and Augmentation Applied"]
cnnf.model_comparison(improvement128_model_list, improvement128_data_list, name_list)
94/94 [==============================] - 5s 42ms/step - loss: 0.1726 - accuracy: 0.9653 94/94 [==============================] - 5s 45ms/step - loss: 0.2479 - accuracy: 0.9527 94/94 [==============================] - 4s 41ms/step - loss: 0.0823 - accuracy: 0.9790 94/94 [==============================] - 4s 41ms/step - loss: 0.1532 - accuracy: 0.9620
From the above, we can see that all of the improvements that we attempted have had a improvement on the model's performance, with the application of Augmentation only having the most profound improvement, with accuracy and loss both being the best performing among all the 4 permutations. Hence, we shall now select this model and hypertune it. One interesting observation is that appling class weights actually causes the model to perform worse, with a 43% drop in loss performance.
From this, we can conclude that augmentation is useful for the 128 x 128 image input size.
First, we shall save the model that we want to hypertune, so that we can hypertune seperately, saving on computational resources.
res128_final_aug.save('res128_best.h5')
Now that we have completed the larger steps in model improvement, we shall now take the smaller, final step to allow the model to reach it's peak performance. We shall do this by hypertuning the parameters of the model. For our case, we shall only tune the optimizer, as the loss function is already very well suited for our use case. We shall tune the models using the Adam, SGD, and RMSProp optimizers.
Stochastic Gradient Descent (SGD) is one of the most basic and widely used optimization algorithms in machine learning and deep learning. It's a variant of gradient descent where instead of performing computations on the whole dataset – which can be computationally intensive for large datasets – SGD updates the model's weights using only a single or a few samples at a time. This makes the algorithm much faster and more suitable for large datasets. The formula for SGD is as follows. $$ w_{t+1} = w_t - \eta \cdot \nabla L(w_t, x_i, y_i) $$
\begin{aligned} \text{where} \\ w_{t+1} &\text{ is the updated weight vector at time } t+1, \\ w_t &\text{ is the weight vector at time } t, \\ \eta &\text{ is the learning rate, and} \\ \nabla L(w_t, x_i, y_i) &\text{ is the gradient of the loss function } L \text{ with respect to the weights } w, \text{ evaluated at a randomly chosen data point } (x_i, y_i). \end{aligned}RMSProp, short for Root Mean Square Propagation, is an adaptive learning rate method proposed by Geoffrey Hinton. It addresses some of the limitations of SGD, especially in the context of minimizing functions in very high-dimensional spaces. RMSProp adjusts the learning rate for each weight based on the recent magnitudes of the gradients for that weight. This means that the learning rate is reduced for weights that consistently receive high gradients, which helps in faster convergence especially in situations involving oscillations. The formula for RMSProp is as follows.
Despite Adam and RMSProp being both able to adjust their learning rates, the values which they start at are still quite important to how the model performs. Hence, we shall do 3 permutations for each optimizer, and see how they improve from there.
earlyStopper = EarlyStopping(monitor = 'val_loss', patience = 15)
def model_hypertuner(optimizer_list, name_list, model_list):
for optimizer, name, model in zip(optimizer_list, name_list, model_list):
clear_output(wait=True)
print(f"Now attempting to tune {name}")
model.compile(optimizer = optimizer, loss = "sparse_categorical_crossentropy", metrics=['accuracy'])
model.fit(aug_res31_train, validation_data = std_res31_validation, epochs = 75, callbacks = earlyStopper, class_weight = class_weights)
gc.collect()
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.optimizers import RMSprop
optimizer_list = [
Adam(),
Adam(learning_rate=0.005),
Adam(learning_rate=0.0002),
SGD(learning_rate=0.005),
SGD(learning_rate=0.05),
SGD(),
RMSprop(learning_rate=0.05),
RMSprop(learning_rate=0.002),
RMSprop(),
]
name_list = [
'Adam_LR_0.001',
'Adam_LR_0.005',
'Adam_LR_0.0002',
'SGD_LR_0.005',
'SGD_LR_0.05',
'SGD_LR_0.01',
'RMSprop_LR_0.05',
'RMSprop_LR_0.002',
'RMSprop_LR_0.01'
]
tuned_31_Adam_LR_0_001 = load_model('res31_best.h5')
tuned_31_Adam_LR_0_005 = load_model('res31_best.h5')
tuned_31_Adam_LR_0_0002 = load_model('res31_best.h5')
tuned_31_SGD_LR_0_005 = load_model('res31_best.h5')
tuned_31_SGD_LR_0_05 = load_model('res31_best.h5')
tuned_31_SGD_LR_0_01 = load_model('res31_best.h5')
tuned_31_RMSprop_LR_0_05 = load_model('res31_best.h5')
tuned_31_RMSprop_LR_0_002 = load_model('res31_best.h5')
tuned_31_RMSprop_LR_0_01 = load_model('res31_best.h5')
output_list = [tuned_31_Adam_LR_0_001, tuned_31_Adam_LR_0_005, tuned_31_Adam_LR_0_0002, tuned_31_SGD_LR_0_005, tuned_31_SGD_LR_0_05, tuned_31_SGD_LR_0_01, tuned_31_RMSprop_LR_0_05, tuned_31_RMSprop_LR_0_002, tuned_31_RMSprop_LR_0_01]
data_list = [std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation, std_res31_validation]
model_hypertuner(optimizer_list, name_list, output_list)
cnnf.model_comparison(output_list, data_list, name_list)
Now attempting to tune RMSprop_LR_0.01 Epoch 1/75 283/283 [==============================] - 15s 43ms/step - loss: 0.3753 - accuracy: 0.8856 - val_loss: 1.4580 - val_accuracy: 0.6820 Epoch 2/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2982 - accuracy: 0.9042 - val_loss: 1.0157 - val_accuracy: 0.7587 Epoch 3/75 283/283 [==============================] - 12s 43ms/step - loss: 0.2779 - accuracy: 0.9160 - val_loss: 1.1390 - val_accuracy: 0.7510 Epoch 4/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2681 - accuracy: 0.9205 - val_loss: 1.0231 - val_accuracy: 0.7717 Epoch 5/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2714 - accuracy: 0.9167 - val_loss: 0.6141 - val_accuracy: 0.8437 Epoch 6/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2758 - accuracy: 0.9198 - val_loss: 0.9498 - val_accuracy: 0.7987 Epoch 7/75 283/283 [==============================] - 12s 43ms/step - loss: 0.2674 - accuracy: 0.9187 - val_loss: 0.9154 - val_accuracy: 0.7980 Epoch 8/75 283/283 [==============================] - 12s 43ms/step - loss: 0.2611 - accuracy: 0.9220 - val_loss: 0.7599 - val_accuracy: 0.8590 Epoch 9/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2532 - accuracy: 0.9243 - val_loss: 0.8581 - val_accuracy: 0.8103 Epoch 10/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2465 - accuracy: 0.9270 - val_loss: 0.7018 - val_accuracy: 0.8480 Epoch 11/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2580 - accuracy: 0.9241 - val_loss: 0.6722 - val_accuracy: 0.8483 Epoch 12/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2386 - accuracy: 0.9294 - val_loss: 0.5131 - val_accuracy: 0.8807 Epoch 13/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2631 - accuracy: 0.9278 - val_loss: 0.6090 - val_accuracy: 0.8503 Epoch 14/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2611 - accuracy: 0.9279 - val_loss: 0.7857 - val_accuracy: 0.8163 Epoch 15/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2399 - accuracy: 0.9300 - val_loss: 0.5388 - val_accuracy: 0.8897 Epoch 16/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2230 - accuracy: 0.9330 - val_loss: 0.7800 - val_accuracy: 0.8423 Epoch 17/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2522 - accuracy: 0.9303 - val_loss: 0.5498 - val_accuracy: 0.8553 Epoch 18/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2564 - accuracy: 0.9255 - val_loss: 0.5863 - val_accuracy: 0.8690 Epoch 19/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2259 - accuracy: 0.9329 - val_loss: 0.5460 - val_accuracy: 0.8667 Epoch 20/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2411 - accuracy: 0.9288 - val_loss: 0.5891 - val_accuracy: 0.8613 Epoch 21/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2255 - accuracy: 0.9323 - val_loss: 0.8443 - val_accuracy: 0.7940 Epoch 22/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2379 - accuracy: 0.9308 - val_loss: 0.3707 - val_accuracy: 0.9207 Epoch 23/75 283/283 [==============================] - 12s 43ms/step - loss: 0.2248 - accuracy: 0.9370 - val_loss: 0.5791 - val_accuracy: 0.8963 Epoch 24/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2338 - accuracy: 0.9329 - val_loss: 0.5993 - val_accuracy: 0.8737 Epoch 25/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2336 - accuracy: 0.9327 - val_loss: 0.6589 - val_accuracy: 0.8590 Epoch 26/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2347 - accuracy: 0.9363 - val_loss: 0.6520 - val_accuracy: 0.8697 Epoch 27/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2266 - accuracy: 0.9318 - val_loss: 1.0028 - val_accuracy: 0.8110 Epoch 28/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2390 - accuracy: 0.9323 - val_loss: 0.7635 - val_accuracy: 0.8417 Epoch 29/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2141 - accuracy: 0.9368 - val_loss: 0.6427 - val_accuracy: 0.8703 Epoch 30/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2205 - accuracy: 0.9399 - val_loss: 0.9982 - val_accuracy: 0.8330 Epoch 31/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2116 - accuracy: 0.9390 - val_loss: 0.4486 - val_accuracy: 0.9077 Epoch 32/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2273 - accuracy: 0.9362 - val_loss: 0.5304 - val_accuracy: 0.8970 Epoch 33/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2276 - accuracy: 0.9364 - val_loss: 0.3944 - val_accuracy: 0.9010 Epoch 34/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2420 - accuracy: 0.9361 - val_loss: 0.6279 - val_accuracy: 0.8727 Epoch 35/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2139 - accuracy: 0.9418 - val_loss: 0.3579 - val_accuracy: 0.9113 Epoch 36/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2133 - accuracy: 0.9399 - val_loss: 0.5748 - val_accuracy: 0.8863 Epoch 37/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2149 - accuracy: 0.9413 - val_loss: 0.6420 - val_accuracy: 0.8863 Epoch 38/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2268 - accuracy: 0.9403 - val_loss: 0.5430 - val_accuracy: 0.8957 Epoch 39/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2043 - accuracy: 0.9446 - val_loss: 0.3671 - val_accuracy: 0.9177 Epoch 40/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2241 - accuracy: 0.9426 - val_loss: 0.4018 - val_accuracy: 0.9143 Epoch 41/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2272 - accuracy: 0.9403 - val_loss: 1.0013 - val_accuracy: 0.7870 Epoch 42/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2155 - accuracy: 0.9407 - val_loss: 0.5323 - val_accuracy: 0.8800 Epoch 43/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2149 - accuracy: 0.9423 - val_loss: 0.3520 - val_accuracy: 0.9277 Epoch 44/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2213 - accuracy: 0.9413 - val_loss: 0.7185 - val_accuracy: 0.8837 Epoch 45/75 283/283 [==============================] - 12s 43ms/step - loss: 0.2394 - accuracy: 0.9394 - val_loss: 1.1354 - val_accuracy: 0.8167 Epoch 46/75 283/283 [==============================] - 13s 45ms/step - loss: 0.2061 - accuracy: 0.9442 - val_loss: 0.6748 - val_accuracy: 0.8583 Epoch 47/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2180 - accuracy: 0.9432 - val_loss: 0.5281 - val_accuracy: 0.8820 Epoch 48/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2106 - accuracy: 0.9425 - val_loss: 0.5566 - val_accuracy: 0.8950 Epoch 49/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2249 - accuracy: 0.9415 - val_loss: 0.8661 - val_accuracy: 0.8370 Epoch 50/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2333 - accuracy: 0.9442 - val_loss: 0.5835 - val_accuracy: 0.8740 Epoch 51/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2092 - accuracy: 0.9461 - val_loss: 0.5614 - val_accuracy: 0.8753 Epoch 52/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2207 - accuracy: 0.9418 - val_loss: 0.4740 - val_accuracy: 0.8930 Epoch 53/75 283/283 [==============================] - 14s 47ms/step - loss: 0.2077 - accuracy: 0.9423 - val_loss: 1.2104 - val_accuracy: 0.7727 Epoch 54/75 283/283 [==============================] - 14s 48ms/step - loss: 0.2226 - accuracy: 0.9459 - val_loss: 0.4695 - val_accuracy: 0.9160 Epoch 55/75 283/283 [==============================] - 13s 44ms/step - loss: 0.2062 - accuracy: 0.9461 - val_loss: 0.5044 - val_accuracy: 0.8810 Epoch 56/75 283/283 [==============================] - 12s 42ms/step - loss: 0.1865 - accuracy: 0.9474 - val_loss: 0.6067 - val_accuracy: 0.8947 Epoch 57/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2100 - accuracy: 0.9442 - val_loss: 0.7134 - val_accuracy: 0.8640 Epoch 58/75 283/283 [==============================] - 12s 42ms/step - loss: 0.2129 - accuracy: 0.9432 - val_loss: 0.4775 - val_accuracy: 0.8913 94/94 [==============================] - 1s 10ms/step - loss: 0.3305 - accuracy: 0.9190 94/94 [==============================] - 1s 10ms/step - loss: 0.7807 - accuracy: 0.8057 94/94 [==============================] - 1s 10ms/step - loss: 0.2485 - accuracy: 0.9403 94/94 [==============================] - 1s 11ms/step - loss: 0.6039 - accuracy: 0.8300 94/94 [==============================] - 1s 10ms/step - loss: 0.4254 - accuracy: 0.8773 94/94 [==============================] - 1s 10ms/step - loss: 0.4747 - accuracy: 0.8670 94/94 [==============================] - 1s 10ms/step - loss: 2043853066469376.0000 - accuracy: 0.0693 94/94 [==============================] - 1s 10ms/step - loss: 0.9589 - accuracy: 0.7867 94/94 [==============================] - 1s 10ms/step - loss: 0.4775 - accuracy: 0.8913
From the above, we can see that Adam is still the best optimizer for the model, and especially when it has a learning rate of 0.0002. We shall save this model again, so that we do not lose our progress, and run it using the test dataset to see it's final performance. Before we evaluate the model's performance on the test dataset, we shall allow the model to train for another 50 epochs, allow for it to perform to the best of it's ability.
history = tuned_31_Adam_LR_0_0002.fit(aug_res31_train, validation_data = std_res31_validation, epochs = 50, class_weight = class_weights)
cnnf.model_metric_graph(history, tuned_31_Adam_LR_0_0002, std_res128_validation, name = "Extended run for final 128 x 128", class_names = class_names)
Epoch 1/50 283/283 [==============================] - 11s 38ms/step - loss: 0.0639 - accuracy: 0.9809 - val_loss: 0.2376 - val_accuracy: 0.9450 Epoch 2/50 283/283 [==============================] - 11s 38ms/step - loss: 0.0663 - accuracy: 0.9772 - val_loss: 0.2423 - val_accuracy: 0.9427 Epoch 3/50 283/283 [==============================] - 11s 38ms/step - loss: 0.0673 - accuracy: 0.9774 - val_loss: 0.2088 - val_accuracy: 0.9467 Epoch 4/50 283/283 [==============================] - 11s 37ms/step - loss: 0.0540 - accuracy: 0.9822 - val_loss: 0.2266 - val_accuracy: 0.9460 Epoch 5/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0585 - accuracy: 0.9790 - val_loss: 0.2071 - val_accuracy: 0.9500 Epoch 6/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0550 - accuracy: 0.9818 - val_loss: 0.2033 - val_accuracy: 0.9507 Epoch 7/50 283/283 [==============================] - 10s 35ms/step - loss: 0.0519 - accuracy: 0.9822 - val_loss: 0.1963 - val_accuracy: 0.9553 Epoch 8/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0533 - accuracy: 0.9839 - val_loss: 0.2297 - val_accuracy: 0.9463 Epoch 9/50 283/283 [==============================] - 10s 35ms/step - loss: 0.0554 - accuracy: 0.9834 - val_loss: 0.2444 - val_accuracy: 0.9427 Epoch 10/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0520 - accuracy: 0.9842 - val_loss: 0.2245 - val_accuracy: 0.9470 Epoch 11/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0508 - accuracy: 0.9826 - val_loss: 0.2667 - val_accuracy: 0.9373 Epoch 12/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0513 - accuracy: 0.9827 - val_loss: 0.2020 - val_accuracy: 0.9563 Epoch 13/50 283/283 [==============================] - 11s 36ms/step - loss: 0.0507 - accuracy: 0.9822 - val_loss: 0.2094 - val_accuracy: 0.9500 Epoch 14/50 283/283 [==============================] - 10s 35ms/step - loss: 0.0512 - accuracy: 0.9829 - val_loss: 0.2272 - val_accuracy: 0.9477 Epoch 15/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0523 - accuracy: 0.9831 - val_loss: 0.2203 - val_accuracy: 0.9517 Epoch 16/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0542 - accuracy: 0.9821 - val_loss: 0.2630 - val_accuracy: 0.9360 Epoch 17/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0539 - accuracy: 0.9816 - val_loss: 0.2405 - val_accuracy: 0.9443 Epoch 18/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0470 - accuracy: 0.9840 - val_loss: 0.2284 - val_accuracy: 0.9533 Epoch 19/50 283/283 [==============================] - 10s 35ms/step - loss: 0.0561 - accuracy: 0.9821 - val_loss: 0.2367 - val_accuracy: 0.9473 Epoch 20/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0461 - accuracy: 0.9854 - val_loss: 0.3225 - val_accuracy: 0.9257 Epoch 21/50 283/283 [==============================] - 10s 35ms/step - loss: 0.0510 - accuracy: 0.9837 - val_loss: 0.2348 - val_accuracy: 0.9467 Epoch 22/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0630 - accuracy: 0.9798 - val_loss: 0.2272 - val_accuracy: 0.9457 Epoch 23/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0542 - accuracy: 0.9815 - val_loss: 0.2603 - val_accuracy: 0.9417 Epoch 24/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0585 - accuracy: 0.9793 - val_loss: 0.2013 - val_accuracy: 0.9523 Epoch 25/50 283/283 [==============================] - 11s 36ms/step - loss: 0.0472 - accuracy: 0.9831 - val_loss: 0.2163 - val_accuracy: 0.9483 Epoch 26/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0459 - accuracy: 0.9844 - val_loss: 0.2173 - val_accuracy: 0.9527 Epoch 27/50 283/283 [==============================] - 11s 36ms/step - loss: 0.0508 - accuracy: 0.9823 - val_loss: 0.2040 - val_accuracy: 0.9530 Epoch 28/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0538 - accuracy: 0.9834 - val_loss: 0.2195 - val_accuracy: 0.9487 Epoch 29/50 283/283 [==============================] - 11s 36ms/step - loss: 0.0526 - accuracy: 0.9808 - val_loss: 0.2431 - val_accuracy: 0.9433 Epoch 30/50 283/283 [==============================] - 11s 37ms/step - loss: 0.0413 - accuracy: 0.9857 - val_loss: 0.2143 - val_accuracy: 0.9497 Epoch 31/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0509 - accuracy: 0.9814 - val_loss: 0.2019 - val_accuracy: 0.9500 Epoch 32/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0506 - accuracy: 0.9840 - val_loss: 0.1964 - val_accuracy: 0.9520 Epoch 33/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0533 - accuracy: 0.9825 - val_loss: 0.1957 - val_accuracy: 0.9557 Epoch 34/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0414 - accuracy: 0.9856 - val_loss: 0.2283 - val_accuracy: 0.9443 Epoch 35/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0493 - accuracy: 0.9816 - val_loss: 0.2162 - val_accuracy: 0.9477 Epoch 36/50 283/283 [==============================] - 11s 36ms/step - loss: 0.0452 - accuracy: 0.9856 - val_loss: 0.2443 - val_accuracy: 0.9457 Epoch 37/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0498 - accuracy: 0.9825 - val_loss: 0.2377 - val_accuracy: 0.9470 Epoch 38/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0474 - accuracy: 0.9834 - val_loss: 0.2308 - val_accuracy: 0.9470 Epoch 39/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0477 - accuracy: 0.9857 - val_loss: 0.1860 - val_accuracy: 0.9577 Epoch 40/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0547 - accuracy: 0.9844 - val_loss: 0.2185 - val_accuracy: 0.9453 Epoch 41/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0485 - accuracy: 0.9839 - val_loss: 0.2106 - val_accuracy: 0.9510 Epoch 42/50 283/283 [==============================] - 11s 36ms/step - loss: 0.0461 - accuracy: 0.9850 - val_loss: 0.2417 - val_accuracy: 0.9437 Epoch 43/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0387 - accuracy: 0.9875 - val_loss: 0.2355 - val_accuracy: 0.9463 Epoch 44/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0463 - accuracy: 0.9850 - val_loss: 0.2385 - val_accuracy: 0.9490 Epoch 45/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0407 - accuracy: 0.9872 - val_loss: 0.2149 - val_accuracy: 0.9517 Epoch 46/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0456 - accuracy: 0.9847 - val_loss: 0.2272 - val_accuracy: 0.9503 Epoch 47/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0421 - accuracy: 0.9852 - val_loss: 0.2093 - val_accuracy: 0.9527 Epoch 48/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0432 - accuracy: 0.9857 - val_loss: 0.2266 - val_accuracy: 0.9423 Epoch 49/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0451 - accuracy: 0.9846 - val_loss: 0.2115 - val_accuracy: 0.9500 Epoch 50/50 283/283 [==============================] - 10s 36ms/step - loss: 0.0449 - accuracy: 0.9827 - val_loss: 0.2080 - val_accuracy: 0.9507
tuned_31_Adam_LR_0_0002.save("res31_best_tuned.h5")
tuned_31_Adam_LR_0_0002.save_weights("res31_best_tuned_weights.h5")
from sklearn.metrics import accuracy_score
from tensorflow.keras.losses import SparseCategoricalCrossentropy
def predict_in_batches(model, dataset, batch_size=32):
predictions = []
true_labels = []
for batch_data, labels in dataset.as_numpy_iterator():
batch_predictions = model.predict(batch_data)
clear_output(wait=True)
predictions.append(batch_predictions)
true_labels.append(labels)
return np.concatenate(predictions, axis=0), np.concatenate(true_labels, axis = 0)
predictions, true_labels = predict_in_batches(tuned_31_Adam_LR_0_0002, std_res31_test, batch_size=32)
scce = SparseCategoricalCrossentropy()
fig = plt.figure(figsize = (20,10))
sns.heatmap(confusion_matrix(true_labels, np.argmax(predictions, axis = 1)), annot = True, xticklabels = class_names, yticklabels = class_names, fmt = "g")
plt.suptitle(f"Final Predictions of model for 31 x 31 images.\n\nAccuracy: {accuracy_score(true_labels, np.argmax(predictions, axis = 1)):.4f}, Loss: {scce(true_labels, predictions).numpy():.4f}")
plt.xlabel('Predicted')
plt.ylabel('True')
plt.show()
We can see that the final model performs quite well, with a accuracy of 94%. We can see that for the most part, the model seems to be able to predict most vegetables correctly, except for Potatoes, which it seems to think are brinjals. Apart from that, the model makes a few mistakes here and there, but is still able to capture most of the detail in the dataset.
def model_hypertuner_128(optimizer_list, name_list, model_list):
for optimizer, name, model in zip(optimizer_list, name_list, model_list):
clear_output(wait=True)
print(f"Now attempting to tune {name}")
model.compile(optimizer = optimizer, loss = "sparse_categorical_crossentropy", metrics=['accuracy'])
model.fit(aug_res128_train, validation_data = std_res128_validation, epochs = 75, callbacks = earlyStopper, class_weight = class_weights)
gc.collect()
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.optimizers import RMSprop
optimizer_list = [
Adam(),
Adam(learning_rate=0.005),
Adam(learning_rate=0.0002),
SGD(learning_rate=0.005),
SGD(learning_rate=0.05),
SGD(),
RMSprop(learning_rate=0.05),
RMSprop(learning_rate=0.002),
RMSprop(),
]
name_list = [
'Adam_LR_0.001',
'Adam_LR_0.005',
'Adam_LR_0.0002',
'SGD_LR_0.005',
'SGD_LR_0.05',
'SGD_LR_0.01',
'RMSprop_LR_0.05',
'RMSprop_LR_0.002',
'RMSprop_LR_0.01'
]
tuned_128_Adam_LR_0_001 = load_model('res128_best.h5')
tuned_128_Adam_LR_0_005 = load_model('res128_best.h5')
tuned_128_Adam_LR_0_0002 = load_model('res128_best.h5')
tuned_128_SGD_LR_0_005 = load_model('res128_best.h5')
tuned_128_SGD_LR_0_05 = load_model('res128_best.h5')
tuned_128_SGD_LR_0_01 = load_model('res128_best.h5')
tuned_128_RMSprop_LR_0_05 = load_model('res128_best.h5')
tuned_128_RMSprop_LR_0_002 = load_model('res128_best.h5')
tuned_128_RMSprop_LR_0_01 = load_model('res128_best.h5')
output_list = [tuned_128_Adam_LR_0_001, tuned_128_Adam_LR_0_005, tuned_128_Adam_LR_0_0002, tuned_128_SGD_LR_0_005, tuned_128_SGD_LR_0_05, tuned_128_SGD_LR_0_01, tuned_128_RMSprop_LR_0_05, tuned_128_RMSprop_LR_0_002, tuned_128_RMSprop_LR_0_01]
data_list = [std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation, std_res128_validation]
model_hypertuner_128(optimizer_list, name_list, output_list)
cnnf.model_comparison(output_list, data_list, name_list)
Now attempting to tune RMSprop_LR_0.01 Epoch 1/75 283/283 [==============================] - 65s 218ms/step - loss: 0.0714 - accuracy: 0.9813 - val_loss: 0.2232 - val_accuracy: 0.9563 Epoch 2/75 283/283 [==============================] - 61s 213ms/step - loss: 0.0524 - accuracy: 0.9854 - val_loss: 0.1875 - val_accuracy: 0.9547 Epoch 3/75 283/283 [==============================] - 62s 217ms/step - loss: 0.0467 - accuracy: 0.9872 - val_loss: 0.1648 - val_accuracy: 0.9663 Epoch 4/75 283/283 [==============================] - 61s 215ms/step - loss: 0.0553 - accuracy: 0.9826 - val_loss: 0.1532 - val_accuracy: 0.9647 Epoch 5/75 283/283 [==============================] - 61s 213ms/step - loss: 0.0464 - accuracy: 0.9868 - val_loss: 0.1049 - val_accuracy: 0.9787 Epoch 6/75 283/283 [==============================] - 63s 220ms/step - loss: 0.0479 - accuracy: 0.9873 - val_loss: 0.1535 - val_accuracy: 0.9723 Epoch 7/75 283/283 [==============================] - 63s 220ms/step - loss: 0.0582 - accuracy: 0.9838 - val_loss: 0.1851 - val_accuracy: 0.9670 Epoch 8/75 283/283 [==============================] - 63s 221ms/step - loss: 0.0494 - accuracy: 0.9875 - val_loss: 0.1479 - val_accuracy: 0.9697 Epoch 9/75 283/283 [==============================] - 62s 218ms/step - loss: 0.0534 - accuracy: 0.9858 - val_loss: 0.4222 - val_accuracy: 0.9210 Epoch 10/75 283/283 [==============================] - 62s 220ms/step - loss: 0.0553 - accuracy: 0.9849 - val_loss: 0.1170 - val_accuracy: 0.9757 Epoch 11/75 283/283 [==============================] - 61s 216ms/step - loss: 0.0586 - accuracy: 0.9858 - val_loss: 0.1213 - val_accuracy: 0.9727 Epoch 12/75 283/283 [==============================] - 62s 217ms/step - loss: 0.0546 - accuracy: 0.9856 - val_loss: 0.1213 - val_accuracy: 0.9700 Epoch 13/75 283/283 [==============================] - 61s 215ms/step - loss: 0.0450 - accuracy: 0.9867 - val_loss: 0.2045 - val_accuracy: 0.9553 Epoch 14/75 283/283 [==============================] - 61s 213ms/step - loss: 0.0563 - accuracy: 0.9850 - val_loss: 0.5783 - val_accuracy: 0.8970 Epoch 15/75 283/283 [==============================] - 62s 218ms/step - loss: 0.0450 - accuracy: 0.9881 - val_loss: 0.1751 - val_accuracy: 0.9643 Epoch 16/75 283/283 [==============================] - 61s 214ms/step - loss: 0.0455 - accuracy: 0.9873 - val_loss: 0.3375 - val_accuracy: 0.9323 Epoch 17/75 283/283 [==============================] - 61s 213ms/step - loss: 0.0534 - accuracy: 0.9866 - val_loss: 0.1415 - val_accuracy: 0.9757 Epoch 18/75 283/283 [==============================] - 62s 219ms/step - loss: 0.0466 - accuracy: 0.9889 - val_loss: 0.2337 - val_accuracy: 0.9607 Epoch 19/75 283/283 [==============================] - 62s 217ms/step - loss: 0.0517 - accuracy: 0.9854 - val_loss: 0.1231 - val_accuracy: 0.9743 Epoch 20/75 283/283 [==============================] - 63s 222ms/step - loss: 0.0504 - accuracy: 0.9875 - val_loss: 0.2232 - val_accuracy: 0.9530 94/94 [==============================] - 5s 45ms/step - loss: 0.0880 - accuracy: 0.9770 94/94 [==============================] - 5s 47ms/step - loss: 0.1283 - accuracy: 0.9677 94/94 [==============================] - 5s 48ms/step - loss: 0.0407 - accuracy: 0.9910 94/94 [==============================] - 5s 47ms/step - loss: 0.0832 - accuracy: 0.9823 94/94 [==============================] - 5s 45ms/step - loss: 0.0718 - accuracy: 0.9823 94/94 [==============================] - 5s 47ms/step - loss: 0.0588 - accuracy: 0.9853 94/94 [==============================] - 5s 48ms/step - loss: 63049.3828 - accuracy: 0.0667 94/94 [==============================] - 5s 47ms/step - loss: 0.8063 - accuracy: 0.8670 94/94 [==============================] - 5s 47ms/step - loss: 0.2232 - accuracy: 0.9530
From the above, we can see that Adam is still the best optimizer for the model, and especially when it has a learning rate of 0.0002. We shall save this model again, so that we do not lose our progress, and run it using the test dataset to see it's final performance. Before we evaluate the model's performance on the test dataset, we shall allow the model to train for another 50 epochs, allow for it to perform to the best of it's ability.
history = tuned_128_Adam_LR_0_0002.fit(aug_res128_train, validation_data = std_res128_validation, epochs = 50, class_weight = class_weights)
cnnf.model_metric_graph(history, tuned_128_Adam_LR_0_0002, std_res128_validation, name = "Extended run for final 128 x 128", class_names = class_names)
Epoch 1/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0115 - accuracy: 0.9967 - val_loss: 0.0422 - val_accuracy: 0.9917 Epoch 2/50 283/283 [==============================] - 59s 208ms/step - loss: 0.0137 - accuracy: 0.9961 - val_loss: 0.0468 - val_accuracy: 0.9897 Epoch 3/50 283/283 [==============================] - 61s 216ms/step - loss: 0.0122 - accuracy: 0.9967 - val_loss: 0.0477 - val_accuracy: 0.9900 Epoch 4/50 283/283 [==============================] - 61s 216ms/step - loss: 0.0094 - accuracy: 0.9977 - val_loss: 0.0389 - val_accuracy: 0.9913 Epoch 5/50 283/283 [==============================] - 62s 216ms/step - loss: 0.0088 - accuracy: 0.9979 - val_loss: 0.0474 - val_accuracy: 0.9893 Epoch 6/50 283/283 [==============================] - 61s 216ms/step - loss: 0.0113 - accuracy: 0.9975 - val_loss: 0.0557 - val_accuracy: 0.9897 Epoch 7/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0117 - accuracy: 0.9969 - val_loss: 0.0529 - val_accuracy: 0.9897 Epoch 8/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0196 - accuracy: 0.9946 - val_loss: 0.0463 - val_accuracy: 0.9890 Epoch 9/50 283/283 [==============================] - 62s 216ms/step - loss: 0.0103 - accuracy: 0.9967 - val_loss: 0.0430 - val_accuracy: 0.9900 Epoch 10/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0111 - accuracy: 0.9972 - val_loss: 0.0455 - val_accuracy: 0.9907 Epoch 11/50 283/283 [==============================] - 62s 216ms/step - loss: 0.0098 - accuracy: 0.9973 - val_loss: 0.0455 - val_accuracy: 0.9897 Epoch 12/50 283/283 [==============================] - 60s 210ms/step - loss: 0.0127 - accuracy: 0.9960 - val_loss: 0.0376 - val_accuracy: 0.9920 Epoch 13/50 283/283 [==============================] - 61s 215ms/step - loss: 0.0133 - accuracy: 0.9966 - val_loss: 0.0340 - val_accuracy: 0.9917 Epoch 14/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0123 - accuracy: 0.9965 - val_loss: 0.0455 - val_accuracy: 0.9907 Epoch 15/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0098 - accuracy: 0.9973 - val_loss: 0.0437 - val_accuracy: 0.9907 Epoch 16/50 283/283 [==============================] - 62s 216ms/step - loss: 0.0092 - accuracy: 0.9972 - val_loss: 0.0388 - val_accuracy: 0.9913 Epoch 17/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0123 - accuracy: 0.9969 - val_loss: 0.0365 - val_accuracy: 0.9907 Epoch 18/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0093 - accuracy: 0.9969 - val_loss: 0.0433 - val_accuracy: 0.9923 Epoch 19/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0117 - accuracy: 0.9966 - val_loss: 0.0376 - val_accuracy: 0.9927 Epoch 20/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0109 - accuracy: 0.9969 - val_loss: 0.0487 - val_accuracy: 0.9900 Epoch 21/50 283/283 [==============================] - 62s 219ms/step - loss: 0.0139 - accuracy: 0.9967 - val_loss: 0.0342 - val_accuracy: 0.9920 Epoch 22/50 283/283 [==============================] - 61s 216ms/step - loss: 0.0130 - accuracy: 0.9962 - val_loss: 0.0337 - val_accuracy: 0.9920 Epoch 23/50 283/283 [==============================] - 61s 215ms/step - loss: 0.0153 - accuracy: 0.9953 - val_loss: 0.0342 - val_accuracy: 0.9917 Epoch 24/50 283/283 [==============================] - 61s 215ms/step - loss: 0.0109 - accuracy: 0.9979 - val_loss: 0.0302 - val_accuracy: 0.9920 Epoch 25/50 283/283 [==============================] - 62s 216ms/step - loss: 0.0092 - accuracy: 0.9971 - val_loss: 0.0372 - val_accuracy: 0.9903 Epoch 26/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0106 - accuracy: 0.9972 - val_loss: 0.0379 - val_accuracy: 0.9903 Epoch 27/50 283/283 [==============================] - 61s 213ms/step - loss: 0.0129 - accuracy: 0.9957 - val_loss: 0.0542 - val_accuracy: 0.9900 Epoch 28/50 283/283 [==============================] - 61s 214ms/step - loss: 0.0081 - accuracy: 0.9979 - val_loss: 0.0480 - val_accuracy: 0.9897 Epoch 29/50 283/283 [==============================] - 61s 213ms/step - loss: 0.0103 - accuracy: 0.9975 - val_loss: 0.0486 - val_accuracy: 0.9907 Epoch 30/50 283/283 [==============================] - 61s 214ms/step - loss: 0.0097 - accuracy: 0.9976 - val_loss: 0.0363 - val_accuracy: 0.9923 Epoch 31/50 283/283 [==============================] - 61s 213ms/step - loss: 0.0115 - accuracy: 0.9965 - val_loss: 0.0469 - val_accuracy: 0.9890 Epoch 32/50 283/283 [==============================] - 61s 216ms/step - loss: 0.0097 - accuracy: 0.9975 - val_loss: 0.0403 - val_accuracy: 0.9907 Epoch 33/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0082 - accuracy: 0.9976 - val_loss: 0.0471 - val_accuracy: 0.9900 Epoch 34/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0111 - accuracy: 0.9967 - val_loss: 0.0397 - val_accuracy: 0.9913 Epoch 35/50 283/283 [==============================] - 62s 216ms/step - loss: 0.0114 - accuracy: 0.9978 - val_loss: 0.0368 - val_accuracy: 0.9907 Epoch 36/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0128 - accuracy: 0.9956 - val_loss: 0.0409 - val_accuracy: 0.9893 Epoch 37/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0073 - accuracy: 0.9984 - val_loss: 0.0397 - val_accuracy: 0.9910 Epoch 38/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0098 - accuracy: 0.9970 - val_loss: 0.0329 - val_accuracy: 0.9923 Epoch 39/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0103 - accuracy: 0.9968 - val_loss: 0.0374 - val_accuracy: 0.9907 Epoch 40/50 283/283 [==============================] - 63s 221ms/step - loss: 0.0093 - accuracy: 0.9977 - val_loss: 0.0400 - val_accuracy: 0.9907 Epoch 41/50 283/283 [==============================] - 65s 227ms/step - loss: 0.0140 - accuracy: 0.9972 - val_loss: 0.0317 - val_accuracy: 0.9923 Epoch 42/50 283/283 [==============================] - 62s 217ms/step - loss: 0.0096 - accuracy: 0.9973 - val_loss: 0.0355 - val_accuracy: 0.9917 Epoch 43/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0069 - accuracy: 0.9981 - val_loss: 0.0387 - val_accuracy: 0.9920 Epoch 44/50 283/283 [==============================] - 62s 218ms/step - loss: 0.0088 - accuracy: 0.9981 - val_loss: 0.0382 - val_accuracy: 0.9907 Epoch 45/50 283/283 [==============================] - 65s 228ms/step - loss: 0.0092 - accuracy: 0.9971 - val_loss: 0.0356 - val_accuracy: 0.9920 Epoch 46/50 283/283 [==============================] - 63s 220ms/step - loss: 0.0083 - accuracy: 0.9972 - val_loss: 0.0385 - val_accuracy: 0.9910 Epoch 47/50 283/283 [==============================] - 63s 223ms/step - loss: 0.0098 - accuracy: 0.9965 - val_loss: 0.0414 - val_accuracy: 0.9887 Epoch 48/50 283/283 [==============================] - 63s 221ms/step - loss: 0.0105 - accuracy: 0.9971 - val_loss: 0.0294 - val_accuracy: 0.9930 Epoch 49/50 283/283 [==============================] - 63s 223ms/step - loss: 0.0091 - accuracy: 0.9973 - val_loss: 0.0310 - val_accuracy: 0.9923 Epoch 50/50 283/283 [==============================] - 63s 222ms/step - loss: 0.0104 - accuracy: 0.9970 - val_loss: 0.0346 - val_accuracy: 0.9913
tuned_128_Adam_LR_0_0002.save("res128_best_tuned.h5")
tuned_128_Adam_LR_0_0002.save_weights("res128_best_tuned_weights.h5")
from sklearn.metrics import accuracy_score
from tensorflow.keras.losses import SparseCategoricalCrossentropy
def predict_in_batches(model, dataset, batch_size=32):
predictions = []
true_labels = []
for batch_data, labels in dataset.as_numpy_iterator():
batch_predictions = model.predict(batch_data)
clear_output(wait=True)
predictions.append(batch_predictions)
true_labels.append(labels)
return np.concatenate(predictions, axis=0), np.concatenate(true_labels, axis = 0)
predictions, true_labels = predict_in_batches(tuned_128_Adam_LR_0_0002, std_res128_test, batch_size=32)
scce = SparseCategoricalCrossentropy()
fig = plt.figure(figsize = (20,10))
sns.heatmap(confusion_matrix(true_labels, np.argmax(predictions, axis = 1)), annot = True, xticklabels = class_names, yticklabels = class_names, fmt = "g")
plt.suptitle(f"Final Predictions of model for 128 x 128 images.\n\nAccuracy: {accuracy_score(true_labels, np.argmax(predictions, axis = 1)):.4f}, Loss: {scce(true_labels, predictions).numpy():.4f}")
plt.xlabel('Predicted')
plt.ylabel('True')
plt.show()
The model performs extremely well on the 128 x 128 dataset, capturing most, if not all of the detail in the dataset, and it reflects in the results. With a accuracy of 99.03% on the test dataset, the model is almost perfect. It does make a few mistakes here and there, but that may be due to inconsistencies in the test data.
To conclude, this exercise has allowed me to learn alot more about the field of deep learning, and the considerations that have to be made when attempting to craft a tool for a task. One part where I understand better are the limitations placed on us, especially regarding the greyscaling and resolution constraints. From the above, it can be seen that the models perform better when given more data (94% on 31 x 31, and 99% on 128 x 128). However, a larger amount of imperfect data is already hard to get, given other constraints, let alone data that is perfectly sized, processed and prepared. From this assignment, we can also see the effects of different ways to improve performance, such as class weights, augmentation etc., and how they affect the model in general.
Saving weights.